THE LAMING OF COMPUTER TECHNOLOGY

Written in July of 2001

  If you're a computer technology buff who keeps an eye on developments in
the field, especially someone who has been an enthusiast of the field for
several years, you're probably disgusted with the way things are turning out.
Everything is going all wrong, it seems. Everything that computers once stood
for, everything that once made them great and exciting, has faded away into a
blur of commercialism and ignorance.
  In this article, I intend to list some of the biggest problems. Sadly, this
is such a daunting task that it's hard to figure out where to begin. It would
be easier just to list what's still going right. But it's not hard to pick
out the one phenomenon that people complain most about, the one thing that
people love to hate and accuse as the perpetrator of all the computer world's
biggest problems. I speak, of course, of Bill & Co.: Microsoft, and more
specifically, Windows.
  Windows just keeps getting worse and worse. Far from gaining anything by
upgrading, you are taking a big risk every time you move to a newer version.
You will find it takes up more disk space and RAM, and the most
absurd thing of all is that nothing will be different: Windows 98 did not
have any significant improvements over Windows 95, nor did Windows ME improve
significantly on 98.
  You already know all this. It has been repeated time and time again by
virtually everyone in the industry, and so it is rather fruitless to dwell
upon it again. Instead, let's take a trip back through time and look at the
history of Windows, and where it went wrong.
  Purists will say that there was no single point: Windows has been a messy,
bloated operating system from its very first version. And while this is true,
it is also worth noting that there was a point when Windows was at least
usable. It may not have been a terrifically stable OS, but it certainly
crashed less than Win9x does today. And--here is perhaps an even more
important point--it was understandable.
  Understandable in a technical sense, that is. In the early 1990s, in the
age of Windows 3.0 and 3.1, Windows could be mostly understood. A power user
could identify every single file that the OS came with (and it came with many
of them), and what that file's function was. Windows 3.x was an operating
system that a normal human being could really understand. Furthermore, it did
not do much "behind the scenes" work, at least not nearly as much as Win9x.
When Windows 3.x did something, you would probably know about it, because you
would have ordered the computer to do it yourself.
  Windows 95 changed all that, extending the trend of behind-the-scenes
activity by a large degree. It would be inaccurate to say that it "began" the
trend, for it had already existed before. But Windows 95 did it in a way that
was simply infuriating. There were constantly little things going on inside
your computer which you did not know about. (Sudden, brief bursts of hard
disk activity, even when nobody is using the computer, are a sure sign of
this.)
Windows 95 began the movement to hide the computer from the user, protect the
user in a "shell" and keep them away from frightening things like swap file
usage or device IRQ configuration. ("Shell" is a very appropriate word: a
protective enclosure inside which the user never sees the technical side of
the computer.)
  Microsoft, and the people who actually side with Microsoft (and there are
some of those), will say that this is done to make the computer easier to
use. The
end user does not want to know about silly technical things. The user is just
that: Someone who uses the computer, and wants to use it with a minimum of
complications. Yet the basic truth is that even a non-technical user could
get into Windows 3.1 and use it and its applications without many
complications. Maybe you had to change a FILES= or BUFFERS= line in your
CONFIG.SYS file; maybe you had to edit your WIN.INI slightly. Big deal! Is
that so hard? The answer is no. Indeed, even a computer illiterate can use a
text editor. The main people who say that editing your startup files is "too
hard" are the people with a hidden agenda: The software company that wants to
convince you to buy their new operating system because it is more
user-friendly.
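  For the record, here is the sort of edit that was involved. A typical
CONFIG.SYS tweak meant changing or adding a couple of plain-text lines like
the following (the exact numbers are only an illustration; the right values
depended on what your applications asked for):

    FILES=40
    BUFFERS=30

WIN.INI was the same story: a plain text file, organized into sections such
as [windows], that any text editor could open and change in a few seconds.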
  Windows 95 is an utterly impenetrable system. It is ludicrously opaque in
its background workings with things like VXD files and ActiveX controls,
things which even a very technical person would be hard-pressed to understand
at all, let alone create or modify with any success. The problem is further
inflamed by the software vendors' interests in concealing the workings of
their software. They don't WANT you to know how the software works. And so
documentation is trivial and shallow, covering little beyond how to use the
software, rather than anything a techie would really want to know.
  Windows 95, to me, marked the beginning of the downward spiral of the
computer industry. Yet the year 1995 was also marked by another trend that
began to spell doom for anyone intelligent: The explosion in popularity of
the World Wide Web.
  Now, the Web had already been around for a few years, no doubt about that.
But hardly anyone had heard of it, and still fewer people really cared. But
when it began to be popular, it REALLY became popular. And while the
underlying concept of the Web (the ability to include pictures with the text,
and to have hyperlinks letting you jump from one point to another) was
useful, it wasn't long before the Web began to be misused. People began to
make elephantine, bloated websites which were laden with graphics that had
absolutely no function at all except to make the site more difficult to
navigate. The original intent of graphics on the Web, to illustrate a point
with a relevant chart or graph, became totally lost in the face of senseless
use of annoying background images, stupid Java "crapplets" which made a small
figure twitch or wiggle, idiotic navigation schemes which were useless but
"cool", and a host of other ills. You could write a whole article on the
lameness of the Web alone. Many people have.
  But the problems in the computer industry extend beyond Windows and the
Web. (Although, to be sure, they are the two most prominent lunacies.)
Consider the development of the computer itself. Ever since the PC market
began, computers have gotten more powerful, in terms of hardware. Faster
processors, more RAM and hard disk space, faster modems, higher-resolution
graphics, etc. etc. And while the way a computer became obsolete in a matter
of months was annoying, it had a side effect that was actually very positive:
Computers were getting better. No matter how much people complained about
their almost-new computers already being outdated, the truth was that it
happened because computers were being IMPROVED. And with those improvements
came new horizons: More powerful software to run on that more powerful
hardware.
  Fast-forward to today. As of this writing, computers are still being
improved dramatically, in a basic sense. CPUs are faster than ever before
(with speeds going well over 1 GHz), hard disks are massive
multi-dozen-gigabyte black holes, and whereas there was a time when 8 MB of
RAM was considered a lot, no computer today ships with less than 64
MB (and commonly, 128 MB). In a sense, that's good, because it lets you do
more with your computer.
  Yet the foolish ignorance of the average computer user is revealed in the
focus on stupid aspects of the hardware which hardly make much difference.
"Overclocking" is lame. People try to make their processors run 50 or even
100 MHz faster, at the risk of damage to the processor and odd glitches
caused by a motherboard/BIOS trying to support a CPU running at a speed it
was never meant to accommodate. Do these people even realize how little
benefit
they get from this?
  In a similar vein are the people who insist on buying computers with fast
CPUs and get very excited about a PC with a 1 GHz+ CPU. Clearly, these
people don't understand something: Unless you are doing a lot of
number-crunching in spreadsheets or some such (and very, very few of these
"power users" are), a fast CPU gives almost no benefit at all. That's because
of the fundamental shift in where speed matters: The uprising of the Internet
has made your Internet connection's speed the most important factor for most
people. How fast you can download your leeto 0day warez matters more than how
fast your CPU is, because downloading a 400 MB file will go a lot faster on
a DSL/cable connection than a 56K dial-up connection, whereas once the
program is finally downloaded, it will probably run just fine on a 500 MHz
computer. Sure, you'll see some performance benefit on a gigahertz-level
computer, but less than you'd probably expect.
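  To put rough numbers on that (assuming about 5 KB/s of real throughput from
a 56K modem and about 150 KB/s from a typical DSL or cable line of the time;
your mileage will vary):

    400 MB at ~5 KB/s   (56K dial-up)  = about 82,000 seconds, almost a full day
    400 MB at ~150 KB/s (DSL/cable)    = about 2,700 seconds, roughly 45 minutes

No CPU upgrade is going to buy you a difference remotely like that.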
  This point is reinforced by the change in how games are programmed.
Games have always been the most power-hungry applications, demanding the most
modern hardware to run effectively. It used to be that your CPU had a direct
influence on how fast games (or any other program, for that matter) would
run. Yet games nowadays do not rely on the CPU for most of their
functionality. All (or very nearly all) games made today use DirectX, meaning
they use your video and audio hardware directly, leaving remarkably little
work for the CPU to do. If a game is running sluggishly, it's a 99% safe bet
that getting a better video card will produce a bigger benefit than getting
a faster CPU. (Unless your CPU is absurdly slow, as in below 200 MHz.) Your
video card largely sets the performance for your gaming. And yet people still
insist that CPU speed is significant and important and actually makes a big
difference in how games perform.
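  To make this concrete, here is a minimal sketch (in C++, assuming the
DirectX 8 SDK headers and libraries; the function name CreateHalDevice and
the fallback logic are my own illustration, not any particular game's code)
of the kind of startup a DirectX game performs on the graphics side. The
D3DDEVTYPE_HAL argument asks Direct3D for the video card's hardware
rasterizer, which is precisely why the card, not the CPU, ends up doing the
heavy lifting:

    #include <windows.h>
    #include <d3d8.h>   // DirectX 8 SDK header; link against d3d8.lib

    // Create a Direct3D 8 rendering device for an existing window.
    IDirect3DDevice8 *CreateHalDevice(HWND hWnd)
    {
        IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
        if (d3d == NULL)
            return NULL;

        // Match the back buffer format to the current desktop display mode.
        D3DDISPLAYMODE mode;
        d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

        D3DPRESENT_PARAMETERS pp;
        ZeroMemory(&pp, sizeof(pp));
        pp.Windowed         = TRUE;
        pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat = mode.Format;

        // D3DDEVTYPE_HAL = the video card's hardware rasterizer. Prefer
        // hardware vertex processing (cards with hardware T&L); fall back
        // to software vertex processing on older cards.
        IDirect3DDevice8 *device = NULL;
        if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                     D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                     &pp, &device)) &&
            FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                     D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                     &pp, &device)))
        {
            device = NULL;
        }

        d3d->Release();
        return device;
    }

Once a device like this exists, the game hands its geometry and textures to
the card through that device, and the CPU mostly just keeps the card fed.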
  Reality check: Virtually any program (game or otherwise) will run at an
acceptable level on a 400 MHz CPU.
  Yet even the misguided fascination with CPU speeds is less disturbing than
the other info-technology hardware trend that has been developing for some
time now: The move away from computers and towards silly little devices. Cell
phones, PDAs, and "Internet appliances" are increasingly the way people get
on the Net now.
  I don't think I even need to explain the lack of functionality in these
devices. You can't run real programs on them. You can't write real programs
on them. (By "real program", I mean a program which does not run on a
one-inch-wide screen, but instead on a platform with a full-size keyboard
with all the letters of the alphabet.) Amazingly (or perhaps not so), these
"personal Internet devices" are actually replacing computers for many people.
It's sad, but it also proves something about people and how they use
technology: A lot of people don't really need the functionality of computers.
They just want to have e-mail. Indeed, many people bought computers
exclusively for the purpose of e-mail. And if that's all you want out of your
machine, I guess a PDA works just as well. (Well, not really; it's got a tiny
screen that's much harder to read, but it'll suffice anyway.)
  "Internet appliances" are another big deal. There are actually two
definitions of this sad buzzword: 1. A device which is basically a
scaled-down computer, usable only for surfing the Web and checking e-mail but
without the functionality to run your own programs. 2. A household appliance
like a dishwasher or microwave with Internet access. Both of these
definitions are laughably pathetic. This kind of development alone, and the
fact that many people are actually choosing these types of devices INSTEAD of
getting a real computer, is proof enough that technology has gone lame.
Computers aren't "hip" anymore. They're old and boring. The cool thing now is
checking your e-mail on a park bench with your cell phone.
  The Internet has created a whole new paradigm in the way people think of
data storage and retrieval. You're no longer expected to store any of your
own data; it's "too much trouble". The days when PCs were supposed to be
personal are gone. Welcome to the wired world, baby. Here, you're not down
with the "in" crowd unless you pay somebody else to store your data for you.
This is the mentality behind ASPs (application service providers) and why
they're so successful. Looking after your own information is hard. If your
computer has a problem, you have to fix it. Instead, people in general have
opted to depend on somebody else to store their stuff, and to count on The
Network to come through and deliver their information when they need it.
  The connectedness of computers has directly led to shocking acts of wanton
stupidity from software developers, the likes of which could not have been
dreamed of a few years ago. Many software applications now require you to
register the program with the company before you can use it. Of course, this
is done through the Internet. The reason is simple and obvious: It cuts down
on piracy. Yet people without an Internet connection are left out in the cold
by this. Of course, the developers just assume that everybody has Internet
access by now, don't they? And of course they wouldn't mind taking just a
moment to register this fine program "to help keep the cost of software
down".
  More outrageous still (and we're talking seriously outrageous at this
point) are programs which require you to download something before you can
use them. This point was truly driven home to me by Fly II, a flight
simulator from Terminal Reality. When you buy a software package off the
shelf, you expect to get a software program inside, on a piece of storage
media (usually a CD-ROM) which you can insert into your computer and install.
Not so with Fly II; this program actually requires you to download most of
the program off the Internet before you can play it. Clearly, the program was
rushed out the door before development was finished, and nobody thought
anything of it because the users could just go on the Net and download a
conveniently manageable 90 MB file in order to play the game. The assumption
that people have Internet access isn't even the worst part of this; it's the
fact that you are literally forced to download a file of almost a hundred
megabytes before you can play a game you paid for, with your own money, off
the store shelf. When this kind of mentality is shown and accepted
from software companies, something has gone seriously wrong. Other programs
exhibit similar, although usually less severe, cases of the same thing, often
a "patch" to fix "minor" bugs which got shipped with the software.
  But the saddest thing of all has not been mentioned yet. The general public
has always been ignorant of technology. The average human has never really
had a big interest in (or understanding of) computers in the first place, and
these developments are just evolutions of that basic fact. The worst thing
about computing today is the way the tech-heads have turned out.
  Throughout its history, the computer world has always had its defenders:
people who actually lent some intelligence to the field. "Hackers", as they
called themselves, were not the newspaper kind of "hacker" but real hackers
who knew what they were doing and did great things with computers.
  You've probably already heard about the difference between "hackers" and
"crackers". (If you haven't, perhaps you should not be reading this.) But
it's worth noting that even crackers were once good people, or at least could
be good people. Is there honor among thieves? Perhaps. But as to whether
there was honor among crackers, the answer is a resounding "Yes". There was
an unwritten cracker code in the 1980s: never damage any computer you broke
into, never do anything stupid or harmful, never harass people or make
trouble for them. To be sure, plenty of people broke those rules, but they
were labeled as the worthless cretins they were and shunned by the rest of
the cracker community.
  Well, today, those rules are all but forgotten. Cracking, which used to be
about the discovery of secret information, has become mostly about DoS
attacks. The main focus is not to learn anything, but simply to harm. This is
partially because of an increase in security: More people are aware of the
importance of computer security than ever before, and tools like firewalls
are now commonly available at prices that most people can reasonably
afford. The result is that cracking is much, much harder than it used to be.
So instead of trying to actually get into systems, script kiddies throw up
their hands and just decide to flood the system. How smart. 
  Moving on to the "real" hackers, they have not turned out much better. Once
the guardians of knowledge and wisdom, hackers today seem to have drastically
lowered their standards. Consider the trumpeting of Linux. Now, I'm not
denying that Linux is a more usable, hackable/customizable, and stable OS
than Microsoft Windows. But the masses of kids who think they're 31337
because they've installed Linux just scream ignorance. Most of those people
don't know the first thing about the OS they use, nor do they make any effort
to learn. Once they install it, they think they're great simply because they
don't run Windows. Reality check, kids: You still don't know anything.
  Furthermore, Linux is over-hyped. It's amazing how much media attention it
has gotten, and the lame part is that it's undeserved, because Linux is a toy
operating system. There is a more powerful operating system which, like
Linux, is based on Unix, but existed long before Linux ever did: BSD. BSD
comes in many variants, most popularly FreeBSD. But whichever variant you
get, BSD simply has functionality that Linux doesn't. Among the truly
in-the-know computer people, the debate is FreeBSD vs. NetBSD vs. OpenBSD vs.
Solaris vs. BeOS (perhaps), and possibly several other systems as well. Among
those who think they know what they're talking about (but really don't), the
debate pretty much comes down to Linux vs. Windows, or, if you're a little
smarter, Red Hat vs. Slackware vs. Debian, etc. Linux is good, but it does
not deserve its hype. It is a perfect example of something which was never
meant to be taken seriously, but was anyway.
  What can you do when everybody seems to be charging in the wrong direction?
Considering that stupid people can rarely be reformed, it's tempting to just
leave them alone and sit in your own sphere of intelligence. But if you think
the problems don't affect you, you're wrong, because public opinion shapes
what companies will produce. Products are marketed for the masses, not
for a small niche. And so, as people continue to think that surfing the Web
on a cell phone is better than using a real computer, or that CPU speed is
the most important spec of a computer system, or that Windows 2000 is a good
operating system because it's more stable than Win9x (a statement which is,
more than anything, damning with faint praise), you will be affected. So,
again, what can you do?
  I don't know. I don't claim to have any useful solutions for this. I'm just
saying what's wrong with the world. Call it whining if you like, because
that's mostly what it is, although my purpose is to identify problems, since
identifying them is the first step towards fixing them.
  In today's world, predicting the future is impossible. For thousands of
years, mankind has had a pretty stable existence, and predicting what the
world would be like in 50 years was fairly simple: Just say "It will be much
like it is today." There have been changes and revolutions in science,
politics, etc., but for the most part, things stayed pretty much the same.
Today, we live in an era totally unlike any other in the history of humanity.
It leaves us with no standard, no historical precedent to compare it to.
Perhaps in 5 years people will stop being lame. Perhaps they will realize
that everything is wrong. Perhaps we will finally develop an operating system
that is both easy-to-use for the end user, and easy-to-hex-edit for the
wirehead.
  And then again, maybe we won't.
