**************************************************************
*                                                            *
*                         CYBERSPACE                         *
*         A biweekly column on net culture appearing         *
*                 in the Toronto Sunday Sun                  *
*                                                            *
*                 Copyright 1999 Karl Mamer                  *
*                Free for online distribution                *
*                    All Rights Reserved                     *
*             Direct comments and questions to:              *
*                                                            *
*                                                            *
**************************************************************
I was a boy during the dawn of the personal computer age. Like most
boys interested in computers at that time, I had to explain a lot of
things to adults about Commodore PETs, Apple IIs, Radio Shack TRS-80s,
and IBM PCs.
One of the hardest things for adults to grasp, I found, was why a
program written for one manufacturer's computer couldn't run on another
manufacturer's computer. A word processor written for the IBM PC, for
example, couldn't run on a Commodore PET.
Computers already seemed needlessly complicated to adults. With
President Reagan making it easier for pretty much anyone to strip-mine a
national park or perform aircraft maintenance, it seemed almost
treasonous that the computer industry was trying to multiply entities
needlessly.
The reason you can't run a program written for the Apple on an IBM PC
is, of course, that they use different chip architectures. Each computer's
chip has a fundamentally different way of moving all the ones and zeros
around.
When you write a program in a high-level language like C, you compile it
before you can run it. Compiling breaks a command like "x = x + 1" down
into machine code the chip can execute. If you want your program to run
on a different type of computer, you need to recompile it for that chip.
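To make that concrete, here's a sketch of the breakdown using Java, a
language we'll get to shortly, because its compiler makes the
intermediate steps easy to inspect. The disassembly in the comments is
roughly what the JDK's javap -c tool prints; a C compiler performs the
same kind of breakdown but goes straight to one chip's machine
instructions, which is why the recompiling is necessary.

    public class Increment {
        // One high-level statement for the compiler to break down.
        static int inc(int x) {
            x = x + 1;
            return x;
        }

        public static void main(String[] args) {
            System.out.println(inc(41)); // prints 42
        }
    }

    // Compile with "javac Increment.java", then disassemble with
    // "javap -c Increment". For inc() it prints roughly:
    //   iload_0    push x onto the stack
    //   iconst_1   push the constant 1
    //   iadd       add the top two values
    //   istore_0   store the result back into x
    //   iload_0    push x again
    //   ireturn    hand it back to the caller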
The solution to this problem is, obviously, for all computers to use the
same chip architecture. From the chip manufacturers' point of view,
that's less than a great idea. Chip manufacturers pay some pretty smart
engineers loads of money and stock options to figure out increasingly
clever ways of moving these ones and zeros around. Intel would prefer
that Motorola not copy all its hard work.
When hardware manufacturers all go off in different directions (consensus
building in the computer world is frequently compared to herding cats),
eyes turn towards the software side to provide a solution.
Emulators are a traditional solution. An emulator basically simulates a
chip in memory. Software is used to move all the ones and zeros around
instead of the circuitry etched on the chip. Unfortunately, emulators
are slow and require vast amounts of memory. They've never been a good
general solution.
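For a flavour of what an emulator does, here's a toy sketch in Java.
The "chip" it simulates is invented for the example, not any real
architecture, but the fetch-decode-execute loop is the same one a real
emulator runs.

    // A toy emulator: an invented chip with one accumulator register
    // and four opcodes, simulated entirely in software.
    public class ToyEmulator {
        static final int HALT = 0, LOAD = 1, ADD = 2, PRINT = 3;

        public static void main(String[] args) {
            // A tiny "machine code" program: load 40, add 2, print, halt.
            int[] memory = {LOAD, 40, ADD, 2, PRINT, HALT};
            int pc = 0;   // program counter
            int acc = 0;  // accumulator register
            boolean running = true;

            while (running) {
                switch (memory[pc++]) {                     // fetch, decode...
                    case LOAD:  acc = memory[pc++]; break;  // ...and execute
                    case ADD:   acc += memory[pc++]; break;
                    case PRINT: System.out.println(acc); break;
                    case HALT:  running = false; break;
                }
            }
        }
    }

Every instruction the toy chip executes costs dozens of instructions on
the real chip underneath, which is exactly where the slowness comes from.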
The other solution is to give everyone the source code and let them
compile it on their own system at run time. Like emulation, this is an
imperfect solution. Compiling a program takes a long time; no one wants
to wait half an hour to run a word processor. Programmers don't want to
give away their source code, either. Finally, C and C++, the high-level
languages most programs are written in, don't transfer ("port") easily
between systems. C code on the Mac, the IBM PC, and Unix gets wildly
different once you hit the tricky stuff.
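The idea itself fits in a few lines. Here's a sketch using the
javax.tools API that later versions of Java grew for this purpose;
Hello.java is a placeholder for whatever source file got shipped, and
the user needs a full development kit installed, not just a runtime.

    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class CompileOnArrival {
        public static void main(String[] args) {
            // Ask the running JDK for its built-in compiler.
            // (Returns null on a bare runtime that ships no compiler.)
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();

            // Compile the shipped source, exactly as "javac Hello.java"
            // would. The user pays the compile time here, every time.
            int status = compiler.run(null, null, null, "Hello.java");
            System.out.println(status == 0 ? "compiled" : "compile failed");
        }
    }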
It was these last two problems that engineers at Sun Microsystems
tackled. The end result was Java. Please stay with me. Java began life
as a language called Oak. Oak was intended for appliances. It had to be
small, portable, and bulletproof against stupid programming. As a
computer maven, you're usually willing to rise above a Windows GPF and
carry on with the proper pioneering spirit. Right? The average consumer,
however, doesn't want to have to reboot a microwave.
Java removes the need to distribute source code by letting the
programmer compile it into an intermediate stage called "byte code." The
tricky, machine-specific stuff is handled by a Java interpreter. As long
as you have a Java interpreter (modern browsers come with Java
interpreters), you can run Java byte code.
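In practice the round trip looks like this (Hello.java is a placeholder
name):

    // Hello.java -- compile it once, anywhere, with "javac Hello.java".
    // The output, Hello.class, is byte code: the very same file runs on
    // a PC, a Mac, or a Unix box, so long as each machine has a Java
    // interpreter to feed it to ("java Hello").
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Same byte code, any chip.");
        }
    }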
The Java interpreter and byte code are ultimately a compromise. Byte
code still needs to go through the Java interpreter before the ones and
zeros get moved around on the chip. Having a machine-specific Java
interpreter is what allows Java itself to be universal. Please stay with
me. While interpreting byte code is quicker than compiling source code
on the spot, it's still not as quick as running native code. Would you
trade your Pentium II for a 486 so your brother can run the same
programs on his Mac?
Some are betting the dream of a universal language is strong enough that
developers and investors will stay with Java long enough for it to
mature.
(geocities.com/lapetitelesson/cs)
(geocities.com/lapetitelesson)