Why It's No Longer Possible To Have A Computer Job

by Adam Luoranen

August 11, 2005

  As I write this, the computer industry has been faring very badly for about
five years. While there have been slumps in the industry before, none in its
history seems to have been so profound or so persistent. Many people claim
that the economy runs on cycles, and that high points and low points are to
be expected, but the industry still shows few signs of recovery, and the
scant signs that have appeared have been unpromising; for example, it is now
considered "good" news if a
month passes by without layoffs in the industry. In other words, companies
are still getting rid of people more than hiring them.
  In the past few years, a great many factors have been cited as causing (or
contributing to) this situation. Among the most important:

- The passing of the year 2000, ending the task of those who made jobs out of
  preparing software and hardware for the much-hyped Y2K bug.
- The continued offshoring of jobs which can be done cheaply, and the
  automation of jobs which can be done by machines.
- Lack of venture capital for start-ups because of the stock market collapse
  and subsequent suspicion of anything which isn't a clear money-winner.
- A great many other factors which do not relate directly to the computer
  industry, but have contributed to a weak overall American economy,
  including the September 11, 2001 terrorist attacks, the war in Iraq, and
  the ballooning price of oil. Economists know that changes in one industry
  can lead to change in other, seemingly unrelated industries.

  While all of these factors play an important role in the state of the
computer industry, it seems that many people have been quick to blame outside
factors, when the most fundamental problem of all lies within the computer
industry itself. The economy is indeed weak, but companies are willing to
spend money on things that they can make use of, and the simple fact is that
they don't need all that many computer products or services anymore.
  Think back in history to all of the people whose names became synonymous
with the computer industry. The modern computer's first revolution (or one of
the earliest revolutions, at least) was the invention of the transistor by
Shockley, Bardeen, and Brattain. This was a great invention, but it only
needed to be invented once. Today, you couldn't make a living inventing the
transistor, because by now it's relatively well-understood, and every
electrical engineer studies its workings in detail when they go through
college.
  The next step from there was being able to connect a great many transistors
into a circuit and fit them onto a tiny piece of silicon: The microchip. The
invention of the microchip, which can be largely credited to Intel and the
giants who did some key foundational work there (including Bob Noyce, Andy
Grove, Gordon Moore, and others), was another great invention. But today,
the process of making microchips is mostly automated.
  After several companies came out with microprocessors, there were a great
many people who began turning those unassuming little chips into full-blown
computers. Perhaps the most famous were Steve Wozniak and Steve Jobs, who
created one of the computers (eventually several of them, in fact) that would
come to define the computer industry for many years to come. Other
companies like Commodore and IBM also designed and produced seminal computers
of their own. However, all that early design work doesn't need to be done
anymore. Today, the process of making a computer has been extensively studied
and documented, and most parts of a computer are manufactured and put
together by machine, with little need for human operators.
  The software to run on those computers doesn't need to be written anymore,
either, because it already exists. Today, software exists to perform just
about every function you could think of (as well as several you probably
couldn't). Not only is there a great deal of software available, but a lot of
it is free. The open-source movement has created a giant library of great
software, and there are websites like SourceForge which contain many of these
programs, available for free.
  Looking back, it becomes clear that what drove the computer industry for
most of its history was its growth. New hardware and software needed to be
developed, and this created jobs. Today, computers have reached a plateau.
While you can always make a computer bigger or faster (numbers go on forever,
so a 250-gigabyte hard drive could theoretically become a 500-gigabyte
model), you do reach a point where doing so is no longer useful or valuable.
  Consider each of the traditional computing jobs. Perhaps the most classic
example of the computer worker is the programmer. There was a time when many
people believed that computer programmers would be in demand forever, because
computers will always need programs written for them. But today, ask
yourself: Do you need any software written for you? Most people don't. Any
software you might need to do any computer task you could want is readily
available, perhaps free online, or perhaps for sale in a store. Indeed, it
has become the norm to ship computers with a standard set of applications: A
word processor, a spreadsheet, a web browser, an e-mail client, and perhaps a
database program or a presentation program. Time and experience have shown
that this basic set of software satisfies most office users' needs for
computer software. Even if people need programs to do other things, they can
probably find them already written.
  If you were going to be a programmer today, what do you think you would
write? Of course, you can always be creative and think up some idea for a
program, but the question then becomes: Will people buy it? Because if it's
not likely to sell, you won't get hired to program it. This is why so many
people create software for free now: They have an idea for a program, but it
doesn't seem marketable, so they go ahead and write it anyway, then just put
it on the Internet for free.
  Of course, you can always argue (and many people have) about different ways
in which programmers can remain useful. Programmers are needed to maintain
existing code, or create new versions of it, right? Yes, but far fewer people
are required to maintain or extend existing code bases than to create a new
program from scratch, and the people who do so are usually the people who
created the code to begin with, meaning if you're not already one of them,
you're going to have a hard time becoming one.
  On the hardware side of things, people who build computers are also late to
the party. Designing a computer has been reduced to a science; there are only
so many ways to connect logic gates into working computer circuits, and a de
facto standard for doing so was developed long ago. You can still open a
textbook and learn about how to design a CPU from gates, or similar topics in
computer architecture, but while these subjects are fundamental to the
workings of a computer, the sad fact is that you're going to have a very hard
time turning this knowledge into a job, because the industry has already
decided it likes one architecture the best, and you'd need to come up with
convincing reasons why a new platform is so much better that the entire
existing software base should be ported to it.
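  To see how settled this ground is, consider the one-bit full adder, the
kind of circuit every digital design textbook walks through on the way to a
CPU. The sketch below (in Python, purely as an illustration; the function
names are my own invention, not anybody's product) shows how little mystery
is left in it:

    # A one-bit full adder built from the basic logic gates covered in any
    # digital design textbook. Purely illustrative; not production code.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def XOR(a, b): return a ^ b

    def full_adder(a, b, carry_in):
        # Add three one-bit inputs; return (sum_bit, carry_out).
        partial = XOR(a, b)
        sum_bit = XOR(partial, carry_in)
        carry_out = OR(AND(a, b), AND(partial, carry_in))
        return sum_bit, carry_out

    # Chain full adders bit by bit and you get a ripple-carry adder,
    # the core of any textbook ALU.
    def ripple_add(x, y, bits=8):
        carry, result = 0, 0
        for i in range(bits):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(ripple_add(100, 55))   # prints 155

Every computer architecture student builds something like this as an
exercise, which is exactly the point: it's knowledge worth having, but it was
worked out decades ago, and nobody is hiring anyone to work it out again.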
  Even simple maintenance workers are hardly needed anymore, because
computers are much more reliable than they used to be. Twenty years ago,
conventional wisdom was that you could be well-employed for the rest of your
life fixing computers, because computers would always be needed. Well, the
people who said that were right about computers always being needed, but they
were wrong in thinking that we'd always need people to fix them. Who would
have guessed that someday, we would make computers *so* reliable that they
simply didn't need anybody to work on them? Computers are not like cars,
which need oil changes, tune-ups, and similar maintenance work. Most
computers today will run virtually their entire productive lives without
needing any regular maintenance at all.
  The problems which do crop up with computers today are almost always
software-based, and those are the very problems which it's virtually
impossible for a maintenance technician to fix. Honestly, how are you going
to troubleshoot "This application has performed an illegal operation at
memory address 05FF2400 and will be shut down," accompanied by a window full
of a hex dump? You can't. The only people who could do that know so much that
they don't work in maintenance, and they are extremely few and far between
(usually employees of the software manufacturer itself).
  I would have loved to be William Shockley. I would like nothing more than
to have been Bob Noyce. I would have been overjoyed if I could have been
Steve Wozniak. Even being Bill Gates wouldn't be so bad. But the simple truth
is that neither you nor I can be any of these people. (Unless you actually
are one of these people, in which case you probably already know what I'm
saying is true.) The work of these pioneers is now the realm of history, and
if you want to do the same kind of thing that they did, you're too late,
because it was done decades ago.
  This is why I say the computer industry has reached a plateau. It's not the
old claims that technology has peaked: That we can't make CPUs any faster, or
hard drives any bigger. It's the simple fact that we have all the technology
most people need. Many office machines are running on technology that's
several years old, and they still don't need to be upgraded, because the poky
old sub-gigahertz CPUs and "tiny" 20-gigabyte hard drives they have are more
than enough for the things people do with those computers.
  What's most convincing of all is the fact that this has been the case for
some time now. The constant in the computer industry used to be change.
Remember how your new computer always went obsolete in a month? That happened
because technology was advancing at an amazing pace. But it's no longer doing
so. Why should it? When the mantra of the technology industry stops being
"change is inevitable", you know that something has truly reached the end of
its development cycle.
  Yes, you can always claim that new innovations will pop up, that people
will create new ideas, that "the next big thing" will come and create jobs. I
won't discount the possibility that this might very well happen. But right
now, it's not happening. Believe me, there are countless people out there who
are brainstorming right this very second, trying to come up with the "next
big thing" that will drive the industry into new development. Yet even though
there's all that thinking going on, no good ideas seem to be coming from it.
You know how they used to say that "the possibilities are limited only by
your imagination"? They were right. But guess what: Even human imagination
has its limits. We're seeing that now. Everything people could imagine
computers might be useful for has already been done.
  Why am I writing all this? Simply because it's true. And too many people
insist on denying it. By and large, those people seem to have a vested
interest in keeping the technology industry going. I do too; I'm a computer
industry worker, after all. But it doesn't seem like there's much point in
maintaining the lie that people are still needed in the industry.
  People are still needed in other industries, though. So while I might no
longer be a programmer, a computer repairperson, or a system administrator
(though I've been all those things in the past), you might still see me as
the person giving you your burger and fries, or the person mowing your lawn,
or the person holding a small plastic cup on the sidewalk.
  See you.

    Source: geocities.com/siliconvalley/2072
