The History Of Computers
The 1970's


The First General Purpose Microprocessor - The Intel 4004

[Intel 4004]

In 1970 Intel released the world's first commercially available DRAM chip (the Intel 1103), and in 1971 it followed up with the world's first EPROM (the Intel 1702).

In 1971, responding to a request for a set of custom chips for a new calculator, and with incredible overkill, Intel built the world's first single chip general purpose microprocessor. Intel then bought back the rights to the design from the calculator maker for $60,000 so that it could sell the chip to other customers. The 4-bit Intel 4004 ran at a clock speed of 108 kHz and contained 2300 transistors. It processed data 4 bits at a time, but its instructions were 8 bits long. The 4004 addressed up to 4 Kb of program memory and up to 640 bytes of data memory (as separate entities). It had sixteen 4-bit (or eight 8-bit) general purpose registers, and an instruction set containing 46 instructions.


UNIX and the C Programming Language

Meanwhile, at the other end of the industry, two talented programmers at AT&T Bell Laboratories (Ken Thompson and Dennis Ritchie) created the B programming language and then its successor, C, which Ritchie designed. C was far from being the first high level language, but its pointer arithmetic and low-level approach made it one of the first languages that could completely replace assembly language programming, even for most of the internals of an operating system. C became the definitive systems programming language - no longer did an operating system need to be tied to a particular piece of hardware.
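As a purely illustrative sketch (modern C syntax, not code from the era), the following routine copies a string a byte at a time using pointer arithmetic - each statement maps almost directly onto the loads, stores and pointer increments an assembly language programmer would otherwise have written by hand:

    #include <stdio.h>

    /* Illustrative only: copy a NUL-terminated string a byte at a time.
       The pointer arithmetic corresponds closely to the underlying
       load/store/increment instructions of the machine. */
    void copy_string(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')
            ;   /* keep copying until the terminating NUL has been copied */
    }

    int main(void)
    {
        char buffer[16];
        copy_string(buffer, "hello, world");
        printf("%s\n", buffer);
        return 0;
    }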

Bell Labs had also been involved in MULTICS - a joint project with MIT and General Electric to build the ultimate operating system. In a reaction against the size and complexity of MULTICS, Ken Thompson and Dennis Ritchie began building the UNIX operating system. Thanks to the C language, less than two man years were spent on the main system software. Several of the key features of UNIX - multitasking (which they called time-sharing), virtual memory, multi-user design and security - did not reach the personal computer market for another ten years, and didn't reach the mainstream IBM PC for almost twenty.

UNIX began life in 1969 on Digital's PDP-7, but it soon migrated to the larger PDP-11. By the time UNIX began to become popular (1974), a well configured PDP-11 had 768 Kb of core memory, two 200 Mb moving head disks (hard disks), a reel to reel tape drive for backup purposes, a dot-matrix line printer and a bunch of [dumb] terminals. This was a high end machine, and even a minimally configured PDP-11 cost about $40,000. Despite the cost, 600 such installations had been put into service by the end of 1974, mostly at universities.

The C language went on to become the dominant language used for both systems and application development in the 1980's. UNIX and the C language were intimately tied from the very beginning - the standard C library was essentially the original UNIX operating system API.
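To get a feel for how thin the layer between the C library and UNIX was, here is a minimal sketch (using today's POSIX spellings rather than 1970's source code) in which the open, read, write and close calls a C program makes are little more than the UNIX system call interface itself:

    #include <fcntl.h>      /* open */
    #include <unistd.h>     /* read, write, close */

    int main(void)
    {
        char buf[512];
        ssize_t n;

        /* These C library calls are thin wrappers around the corresponding
           UNIX system calls. "/etc/motd" is just an example file name. */
        int fd = open("/etc/motd", O_RDONLY);
        if (fd < 0)
            return 1;

        while ((n = read(fd, buf, sizeof buf)) > 0)
            write(1, buf, (size_t)n);   /* descriptor 1 is standard output */

        close(fd);
        return 0;
    }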


The Winchester Hard Disk Drive

In 1973, IBM developed what is considered to be the first true sealed hard disk drive. Its design goal of two 30 Mb spindles ("30-30") earned it the nickname "Winchester", after the famous Winchester .30-30 rifle. Over the following decade, sealed hard disks (often called Winchester disks) took their place as the primary data storage medium, initially in mainframes, then in minicomputers, and finally in personal computers starting with the IBM PC/XT in 1983. By the late 1980's hard disk capacity had improved almost a thousand fold, with single hard disks able to store gigabytes of data.


CP/M

Developed by Gary Kildall in 1974, CP/M stood for Control Program for Microcomputers. It was the first microcomputer operating system able to run on machines from different vendors, and it became the preferred operating system for software development on small systems. In the mid 1970's, CP/M looked like it would rule forever, but unfortunately the early personal computers chose not to use CP/M, electing instead to provide a BASIC interpreter as their primary "operating system".


The First Personal Computer - MITS Altair

Although the Altair wasn't actually the first personal computer, it was the first to grab attention. MITS sold 2000 of them in 1975 - more than any computer before it. Costing only $439, the Altair was a kit which you had to build yourself. It was based on Intel's 8-bit 8080 processor and included 256 bytes of memory (expandable to a few Kb), a set of toggle switches and an LED panel. If you wanted a keyboard, screen or storage device you had to buy expansion cards! For 4 and 8 Kb Altairs, MITS offered a BASIC interpreter. This interpreter was the first product developed by Bill Gates' and Paul Allen's new company, Microsoft.


Vector Supercomputers Arrive - The Cray-1

In 1976, Seymour Cray's company, Cray Research, introduced the Cray-1, the fastest computer in the world at that time. The Cray-1 had the historically unique distinction of being simultaneously the fastest computer in the world, the most expensive, and the one with the best price/performance.

The Cray-1 wasn't the first vector computer, but it was the first to have fast normal (scalar) performance as well as fast vector performance. Its efficient, pipelined design meant that the Cray-1 did everything fast. This was probably the most significant reason for its success - some customers bought it even though their problems were not vectorizable, purely for its fast general performance. Whatever the reason, the Cray-1 was the first successful vector supercomputer, and created a whole new market for high end vector machines and vectorizing compilers.


[Apple II]

The Apple II

Steve Jobs and Steve Wozniak's Apple II was really the beginning of the personal computer boom. It debuted at the first West Coast Computer Faire in San Francisco in 1977. With a built-in keyboard, color graphics output, and BASIC built into ROM, the Apple II was actually useful.

The Apple II was based on a MOS 6502 processor, had color graphics (a huge innovation), and used an audio cassette drive for storage. In its original configuration with just 4 Kb of RAM it cost $1298. Two years later the standard memory grew to 48 Kb with the introduction of the Apple II+. The cassette drive didn't work very well, so in the end most Apple II owners bought the floppy disk drive when it was released in 1978.

MOS Technology's 6502 processor was chosen for the first Apple computers not because it was powerful, but because it was cheap. Introduced in 1975 at under $100 (compared with $375 for the similar Motorola 6800), the 6502 was a real bargain. The 6502 only had three 8-bit registers (accumulator, X and Y) plus an 8-bit stack pointer, but this made sense because at that time RAM was actually faster than processors, so it was better to optimize for RAM access than to add more registers to the chip. The instruction set contained 56 instructions which used 9 addressing modes. For a whole generation of programmers (myself included), 6502 assembly language was the second programming language learned (BASIC was the first). Fifteen years later, a 6502 derivative was still in use at the heart of the Nintendo Entertainment System.


[Commodore PET]

The Commodore PET

The PET was the beginning of a line of low cost Commodore computers which brought computing to the masses. Like the Apple II, the PET ran on the MOS 6502, but the PET cost only $795, which made it almost half the price of the Apple II. It included 4 Kb of RAM, monochrome graphics, and used an audio cassette drive for data storage. It also included a version of BASIC in ROM. The keyboard, cassette drive and small monochrome display all fit within the same trapezoidal one piece unit.


The Radio Shack TRS-80

Also released in 1977, the TRS-80 (lovingly called the Trash-80) rounded out the first trio of consumer ready personal computers. The base unit was essentially a thick keyboard with 4 Kb of RAM and 4 Kb of ROM (which included BASIC). An optional expansion box allowed the memory to be expanded. Software was distributed on audio cassettes played from Radio Shack cassette recorders. Although it had some following, the TRS-80 was soundly defeated in the marketplace by the Apple II, and later by the Commodore 64.


Digital's VAX - The Minicomputer Revolution

Although UNIX started life on Digital's PDP-7 and PDP-11 minicomputers, it was Digital's VAX (introduced in 1977) that became the dominant architecture powering the UNIX and VMS minicomputers which started the demise of the mainframe. VAX stood for Virtual Address eXtension (of the PDP-11), and it was a large and complex 32-bit CISC architecture. Processors which implemented the VAX architecture went through many revisions during its 20 year lifetime, and used a wide range of single and multi chip technologies. VAX processors scaled up to huge departmental super-minicomputers, and down to small desktop workstations. Despite this, the VAX architecture remained relatively stable throughout its lifetime, a testimony to its design.

Many people consider the VAX to be the ultimate CISC architecture - it had a huge number of instructions (over 300), including instructions for string manipulation, polynomial evaluation, and BCD arithmetic. Most instructions could specify their arguments using any of the 13 addressing modes, allowing for memory-to-memory-to-memory operations! However, the complex instructions were not always the fastest way of doing things. For example, replacing the INDEX instruction with a sequence of simpler VAX instructions made the same operation run 45% to 60% faster. Observations like this were one inspiration for the RISC philosophy. Writing a compiler which did good instruction selection for the VAX was also non-trivial.

The VAX-11/780 was introduced in 1977 at an entry price of about $200,000. The 11/780 had the distinction of being labeled as the speed benchmark for 1 MIPS (Million Instructions Per Second), even though its actual execution speed was only about 0.5 MIPS. Some people justified this by saying that 500,000 VAX instructions were equivalent to a million instructions for most other architectures (although I don't think it was quite that bad). Actually, the 1 MIPS label came from the fact that the VAX-11/780 was about the same speed as the IBM 370/158, which IBM marketed as a 1 MIPS machine. The VAX-11/780 became far more popular than the 370/158, so it ended up being the base machine for the relative MIPS measure and later the SPEC89 and SPEC92 benchmark suites.


VisiCalc

VisiCalc (released in 1979 running on an Apple II) was what really made people look at personal computers as business tools, not just toys. It was a very simple spreadsheet, but it did useful things and made life easier. VisiCalc was a godsend for Wall Street users. If the Apple II was the father of all personal computers, VisiCalc was the father of all personal productivity software.


The History Of Computers
The 1980's


The Commodore VIC-20

Riding on the success of the PET, Commodore drove its engineers to make computers that anyone could afford. The 6502 based VIC-20, introduced in 1981, was the first color computer that cost under $300. It was also the first computer to sell one million units. VIC-20 production hit 9000 units per day - a rate that was absolutely phenomenal back then. While shortsighted critics kept asking what these machines were good for, Commodore was the first company to introduce millions of people to personal computing.


[Osborne 1 Portable]

The First Portable Computer - The Osborne 1

Adam Osborne sold Osborne Books to McGraw-Hill and started Osborne Computer. Its first product, the 24 pound Osborne 1 "portable" released in 1981, cost $1795. The Osborne 1 was easily identified by its tiny built-in screen. Osborne also started the practice of bundling software with the computer - the Osborne 1 came with nearly $1500 worth of programs! Unfortunately Osborne went bankrupt after it preannounced its next computer while still trying to sell the Osborne 1.


[IBM PC]

The IBM PC

In 1981, the landmark announcement of the IBM PC stunned the computing world. People had always thought of IBM as a high end mainframe player. Even the chairman of IBM is supposed to have looked at the original PC and said that it would never fly - that mainframes would dominate forever.

Despite its weaknesses, IBM did get one critical thing right with the PC - it was based on an open architecture so that it could grow into the future. This strategy, combined with IBM's huge influence and the release of Lotus 1-2-3 a year later, made business people sit up and take notice. The PC and its descendants went on to dominate the computing industry.

The original PC cost $3000, and came with 64 Kb of RAM, a floppy disk drive and monochrome graphics. It also came with DOS, an operating system based on CP/M. In an effort to save time so that it could catch the early personal computer market, IBM chose to license DOS from the then tiny Microsoft instead of writing its own operating system. For many years to come IBM would regret the decision not to write its own PC operating system. Eventually it did do so - OS/2.

The IBM PC was based on Intel's 8088 processor. The 8088 was a 16-bit processor which had 8 registers, about 100 instructions, and an unusual (some would say brain damaged) segmented 20-bit memory architecture capable of addressing 1 Mb of memory. It ran at a clock speed of 4.77 MHz in the original IBM PC. The 8088 was actually the second x86 processor. Its predecessor, the 8086 (released in 1978), used 16-bit external buses, whereas the 8088 (released in 1979) used 8-bit buses. This made the 8088 about 20% slower than the 8086, but 8-bit buses were critical to keeping down the total system cost.
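A rough sketch of how that segmented addressing worked: a 20-bit physical address was formed by shifting a 16-bit segment value left four bits and adding a 16-bit offset, so the same physical byte could be reached through many different segment:offset pairs (the particular values below are just examples):

    #include <stdio.h>

    /* Sketch of 8086/8088 real-mode address formation:
       physical = segment * 16 + offset, giving a 20-bit (1 Mb) address space. */
    unsigned long physical_address(unsigned int segment, unsigned int offset)
    {
        return ((unsigned long)segment << 4) + (unsigned long)offset;
    }

    int main(void)
    {
        /* Two different segment:offset pairs naming the same physical byte. */
        printf("%05lX\n", physical_address(0xB800, 0x0000));  /* B8000 */
        printf("%05lX\n", physical_address(0xB000, 0x8000));  /* B8000 */
        return 0;
    }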

IBM's decision to use the x86 architecture was widely criticized, and led to the PC and its descendants facing many problems that other machines didn't face, mainly because of the x86's segmented memory model. So why did IBM choose the 8086 series when alternatives such as the Motorola 68000 were so much better? Apparently IBM's own engineers wanted to use the 68000, but IBM had already obtained the rights to manufacture the 8086 (for use in its Displaywriter intelligent typewriter), in exchange for giving Intel the rights to its bubble memory technology. Another factor was that the 8088 could use existing low cost 8-bit components, whereas 68000 components were more expensive and not widely available at the time. In any case, thanks to the PC's open design, the Intel x86 architecture went on to completely dominate the computing industry - proof that technical superiority sometimes doesn't matter.


Graphical User Interfaces Arrive - The Xerox Alto

The Xerox Alto, developed in 1973, started the graphical user interface revolution which would sweep through the computer industry over the following decade. The desk-sized Alto, and its commercialized descendant the Xerox Star (1981), were the first GUI-based computers. Researchers at Xerox PARC (Palo Alto Research Center) developed the basic ideas of a graphical user interface along with all the associated innovations - the mouse, the desktop metaphor, icons, windows, menus etc. Although the ideas in the Xerox Star were revolutionary, it was a huge commercial failure, due mainly to its price tag of around $50,000.

When Steve Jobs took a tour of Xerox PARC in 1979, he saw the Alto and realized it was the future of computing. He quickly began to work towards bringing the technology to market. Many of the ideas in the Alto showed up a few years later in the Apple Lisa (1983), and finally reached the mass market in the Apple Macintosh. Several Xerox researchers also left to join Apple.


[Commodore 64]

The Commodore 64

In 1982, a year after the huge success of the VIC-20, Commodore introduced the Commodore 64. This was the machine that brought computers to the masses.

The Commodore 64 reached an altogether new level of popularity. A decade later it still held the record as the best-selling single computer model of all time. An estimated 22 million units were sold. That's almost as many as all the Macintosh models put together, and it dwarfs IBM's top-selling systems, the PC and the AT. For the first time ever, millions and millions of people all over the world went and bought a computer - a Commodore 64 - from their local department store!

The Commodore 64 set a number of technical firsts too. It was the first cheap computer to have a whopping 64 Kb of RAM, it was the first personal computer with an audio synthesizer chip, and the portable version, the SX-64 (1983), was the first color portable.

More than that, the Commodore 64 was a fun machine. Although it was only based on the MOS 6510 processor (a slightly modified version of the 6502 used to power the Apple II five years earlier), the 64 had fast color graphics with hardware sprites. Compared to the Apple II's slower color graphics and the IBM PC's monochrome display, it was way ahead. It had enough memory to make really good software, so the 64 software market boomed, especially the games market. And, as possibly its most important feature, the price was right - the Commodore 64 cost around $400.

As with the Apple II, software for the 64 was distributed on audio tapes, or on floppy disks for those who bought the optional floppy disk drive (which cost almost as much as the machine itself). In rare cases plug-in cartridges were also used. Like the VIC-20, the 64 used a TV set as its display device (which is part of the reason it was able to be so cheap).

For many of my generation, myself included, the Commodore 64 was the first computer we ever owned or used.


Lotus 1-2-3

VisiCalc on the Apple II may have sold Wall Street on the idea of electronic spreadsheets, but Lotus 1-2-3 was the spreadsheet that Wall Street adopted. When the IBM PC took over the business world in the early 1980's, Lotus 1-2-3's simple but elegant grid was without question the best spreadsheet available. It added simple chart style graphics and data retrieval functions to the paradigm established by VisiCalc. By the early 1990's, Lotus could brag that Lotus 1-2-3 was the best-selling application of all time. Lotus's period of dominance finally ended when Microsoft Excel came out with a graphical user interface for Microsoft Windows.


Compaq - The First PC Clone

Compaq's portable, released in 1983, almost single handedly created the PC clone market. It weighed a ton (20 pounds), but it was the first successful PC clone. Columbia Data Products had just preceded Compaq with the first true IBM PC clone, but they didn't survive for long. It was Compaq's quickly earned reputation for quality, and its essentially 100% IBM compatibility (reverse engineered), that created the clone market.


Radio Shack's TRS-80 Model 100

Years before mainstream notebook computers, Radio Shack came out with a book sized portable offering an unbeatable combination of features, battery life, weight, and price. Released in 1983, the $800 TRS-80 Model 100 had an 8-row by 40-column reflective LCD screen, ROM based applications including a text editor and a communications program, a built-in modem, nonvolatile RAM, and a full keyboard. Weighing under 4 lb, and with a battery life measured in weeks (it ran on four AA batteries), the TRS-80 Model 100 became the first popular laptop, especially among journalists. With its battery-backed RAM, the Model 100 was always in standby mode, ready to take notes, write a report, or go on-line. NEC's PC-8201 was essentially the same Kyocera-manufactured system.


dBase

Wayne Ratliff's dBase was originally intended to manage a company football pool, but ended up being the first serious database system for personal computers. dBase II, which ran first on CP/M and later on DOS, was a massive success. It provided just the right combination of features for a small business database, plus it was relatively easy to learn, and it ran on cheap PC's. As a result, lots of small businesses which had previously survived without a database system decided dBase running on a PC was cheap enough and offered enough benefits to be worth buying. Ashton-Tate acquired dBase from Ratliff when dBase was at its peak (dBase III), but let its users down badly by releasing the bug-ridden dBase IV in 1988. dBase for Windows didn't arrive until 1994, by which time its market share was gone.


[Apple Macintosh]

The Apple Macintosh

In January 1984 the introduction of Apple's Macintosh computer, with its graphical user interface, generated even more excitement than the IBM PC had three years earlier. Apple's R&D people were inspired by the critical ideas developed at Xerox PARC (and first tried out on the Apple Lisa), but Apple's programmers also added many of their own ideas to create the final polished Macintosh product. It was this polished product that changed the way people used computers.

The Macintosh (lovingly called simply the Mac) was introduced in the famous "1984" TV commercial broadcast during the Super Bowl. It featured a small built-in high resolution monochrome display, the wonderful Macintosh operating system with its graphical user interface, and a clunky looking single button mouse. It sold for $2495. With only 128 Kb of RAM, the Mac was memory starved at first, but later models quickly corrected this.

Apple included several key applications that made the Macintosh immediately useful. MacPaint showed people what a mouse was good for, and MacWrite demonstrated that WYSIWYG (What You See Is What You Get) word processing really worked. The Macintosh redefined what we meant when we said that a program was easy to use. The Mac also had a floppy disk drive that used 3.5" disks, which were physically smaller than their 5.25" PC counterparts, but were sturdier and could hold more data (400k).

A couple of years later, Aldus PageMaker allowed high end desktop publishing to be performed on a Macintosh. PageMaker's paste-up metaphor made sense to people who had worked in traditional design and production departments. A Mac became the tool of choice on which to run a publishing business, and the combination of a Macintosh, PageMaker and the PostScript based Apple LaserWriter laser printer went on to dominate the desktop publishing industry.

The Macintosh was powered by Motorola's 68000 processor, a powerful 32-bit processor which had been around since 1979. The 68000 contained about 60,000 transistors, and had 16 registers and a large instruction set which used 13 addressing modes. Future versions of the 68000 architecture would further extend this so that by the end of the 68000 line over a thousand different combinations of instructions and addressing modes were possible!


Word Perfect

In 1984, Satellite Software International introduced WordPerfect, a powerful new word processor for the IBM PC. Despite having a relatively bland and unfriendly character cell user interface, WordPerfect soon became the dominant word processor for the PC market, especially in the business/secretarial world. The ability to use WordPerfect became an essential skill for most secretaries, and remained so until the early 90's and the introduction of Microsoft Windows. A graphical user interface version of WordPerfect was introduced for Windows, along with versions for the Macintosh, the Amiga and the Atari ST, but Microsoft Word managed to grab most of the GUI word processing market.


[IBM PC/AT]

The IBM PC/AT

Building on the strength of the PC (1981) and PC/XT (1983), the PC/AT (1984) brought a major increase in performance and storage capacity. Although it looked like the original PC, Intel's fast 80286 processor running at 6 MHz, combined with 16-bit buses, made the AT several times faster than the original PC. AT systems also came with much more RAM, usually 512 or 640 Kb, and new high-density 1.2 Mb floppy disk drives.

Hard disks of up to 20 Mb became available, and you could even install two if you wanted. New 16-bit expansion slots allowed for faster expansion cards but maintained backward compatibility with the old 8-bit cards. The hardware changes meant a new version of DOS (the dreaded 3.0).

The price for an AT with 512 Kb of RAM, a serial/parallel adapter, a high-density floppy drive, and a 20 Mb hard disk was over $5000, but at the time this was less than everyone expected.


[Amiga 500]

The Commodore Amiga

In 1985 the Amiga introduced the world to multimedia. The talented engineers who designed the Amiga (a small independent company later bought by Commodore) happened to hit on a basic configuration that all personal computers would eventually move towards. Unfortunately, the Amiga was so far ahead of its time that almost nobody - including Commodore - really understood what it was. Today it is obvious that the Amiga was the first multimedia computer, but in those days it was viewed largely as a games machine, because few people grasped the importance of advanced graphics and sound combined with a multitasking operating system and a graphical user interface.

Like the Macintosh, the Amiga was based on the Motorola 68000 processor. The initial model, the Amiga 1000, had 256 Kb of RAM. It was soon superseded by the lower cost Amiga 500 at the low end (shown in the picture), and the Amiga 2000 at the high end. Both offered 512 Kb of RAM standard, expandable to 1 Mb on the Amiga 500 and a whopping 8 Mb on the Amiga 2000.

Unlike previous personal computers, the Amiga used three custom chips (Agnus, Denise and Paula) to do advanced graphics and sound. The graphics in particular were amazing by the standards of the day. At a time when PC users thought 16 color low resolution EGA was hot stuff, the Amiga could display 4096 colors, could reach the extremely high resolution of 640x400, and had custom hardware such as a blitter for accelerated graphics. It even had built-in video outputs for TV's and VCR's (a decade later this was still a pricey extra cost option for most systems). The Amiga's audio system was also impressive. Building on the audio capabilities of the Commodore 64, the Amiga had four voice sampled stereo sound and was one of the first computers with built-in speech synthesis. Although it only cost $1200, the Amiga did graphics, sound, and video well enough that many broadcast professionals adopted it for special effects. With a small investment, even a home user could do reasonable quality desktop video production (I remember doing quite a few videos myself for high school and university presentations).

The Amiga's operating system, designed by Carl Sassenrath, was just as amazing. From the outset it had preemptive multitasking, a graphical user interface, shared libraries, messaging, scripting, and multiple simultaneous command line consoles. Ten years later, PC and Macintosh users were still waiting for some of those features. Thanks to the custom chips and an efficient operating system, the Amiga even felt fast. The user interface was really quite snappy, much faster than a Macintosh. Five years later, Microsoft Windows running on a much faster 80386 based PC still felt slower.

The Amiga was also the first platform to make major use of emulation of other operating systems. Emulators for the IBM PC and the Apple Macintosh became quite widely used, often in an attempt to make the Amiga more useful for business purposes (it didn't have a lot of business software). The technology of emulation went on to become very important in the next decade.


The X Window System - UNIX gets a GUI

[The X Window System]

The X Window System (known as just X, or sometimes [incorrectly] as X Windows), first appeared in the mid 1980's running on DEC VAX based UNIX workstations. X was developed at MIT (with support from Digital) as part of MIT's Project Athena distributed workstation environment. Many of the core ideas of X (as well as the X name) were derived from an earlier Stanford windowing system named W. Other ideas came from Sun's SunView environment which ran on the 68000 based Sun-3 workstations. The early versions of X were developed primarily by Robert Scheifler, Ron Newman and Jim Gettys. X went on to become the basic graphics system of all the RISC based UNIX workstations.

Aside from providing UNIX with a graphical user interface, X's main contribution to the computing world was the idea of displaying an application remotely over a network (ie: running an application on one machine but displaying its user interface on another, much like telnet already provided for command line UNIX users). The implementation was a client-server approach, where an X window system server ran on the displaying machine, and the client programs communicated with it using a network protocol. The X server and its client programs could be running on the same machine or on different machines - it didn't matter.
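A minimal sketch of that client-server split, written against the Xlib C interface that X client programs used (compile with -lX11): the window appears on whichever machine the DISPLAY connection points at, regardless of where the program itself runs:

    #include <X11/Xlib.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* NULL means "use the DISPLAY environment variable", which may
           name an X server on this machine or on one across the network. */
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot connect to X server\n");
            return 1;
        }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 300, 200, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));
        XMapWindow(dpy, win);   /* ask the server to display the window */
        XFlush(dpy);            /* push the buffered requests over the wire */

        sleep(5);               /* keep the connection open briefly */
        XCloseDisplay(dpy);
        return 0;
    }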

X had an unusual career. In somewhat of an odd decision, its designers decided that X should only provide mechanism, not policy. So X did not provide any particular look and feel, but instead only provided the basic mechanisms on top of which several different user interface styles were later implemented. At least three major user interface look & feel styles were widely used on X - MIT's own Athena style, Sun and AT&T's OpenLook, and OSF's Motif (supported primarily by HP and IBM). Other less significant styles included Digital's DECwindows, Silicon Graphics' 4Sight and several public domain styles. None of the styles interacted particularly well with the others, and as a result X was plagued by inconsistencies between applications.


[Atari ST]

The Atari ST

Atari was known primarily as a maker of games machines, and the ST, introduced in 1985, was its first major foray into the world of personal computing. Like the Macintosh and Amiga, the Atari ST was based on the Motorola 68000 processor. It offered medium resolution color graphics and high quality stereo sound, and its operating system featured Digital Research's GEM graphical user interface.

Unfortunately, for mainstream business uses the ST couldn't compete with the PC and Macintosh, and for graphics and games it couldn't compete with the Amiga. As a result, the ST struggled to find its place in the market. Eventually it managed to carve out a niche in the music and audio editing market, where many music professionals used it as an advanced sound mixing and sound effects machine.


MIPS - The First Commercial RISC

The MIPS R2000, introduced in June 1986, was [arguably] the first commercial RISC processor. It was a descendant of the Stanford MIPS project led by John Hennessy, one of the three pioneering RISC research projects of the early 1980's. In stark contrast to the complicated CISC architectures of the 1980's, the MIPS architecture only had about 50 instructions and was a load store architecture with just a single addressing mode. Instead of achieving high performance through complexity, the MIPS design had 64 registers (32 int + 32 fp) and used an efficient pipelined design to achieve almost one instruction per cycle. The original R2000 contained just 110,000 transistors (compared to almost 300,000 for the Intel 386), ran at 12 MHz and clocked in at an impressive 9 MIPS (six times the speed of the 386). It went on to power some of the first RISC based workstations, including Digital's DECstations.

MIPS stood for Microprocessor without Interlocked Pipeline Stages. In somewhat of an irony, however, pipeline interlocks had to be added back into the architecture five years later (1991) for the third generation MIPS processor, the R4000 (actually, even the R2000 had interlocks for the HI/LO register pair used by the multi-cycle multiply and divide instructions).


The Intel 80386 Processor - x86 goes 32-bit

The 80386 heralded the beginning of a new age for the IBM PC. The 386 was the first 32-bit x86 processor. As such, it was capable of breaking the 640 Kb memory barrier and running software written for graphical user interfaces. The 386 introduced a 32-bit architecture while maintaining full backward compatibility with earlier x86 processors. This was accomplished by using two operating modes: "real" mode, which mirrored the segmented memory of the older x86's, and "protected" mode which took full advantage of the 386's 32-bit enhancements.

The 386 began shipping in August 1986, but unfortunately it was several years before PC operating systems could make use of its 32-bit capabilities. IBM's OS/2 and Microsoft's Windows 95 were really the first mainstream 32-bit PC operating systems, and even on them most applications were still 16-bit!


Compaq's Deskpro 386

While IBM was busy developing its proprietary MicroChannel based PS/2 systems, clone vendors ALR and Compaq grabbed control of the x86 market by introducing the first 386-based PC's, the Access 386 and the Deskpro 386, just a couple of months after Intel began shipping the 80386 processor. This marked the end of IBM's dominance of the IBM PC market. Both 386 clone systems maintained backward compatibility with the 286-based PC/AT. Compaq's Deskpro 386 had a further performance innovation in its bus architecture - it split the x86 external bus into two separate buses: a fast local bus to support memory chips fast enough for a 16 MHz 386, and a slower I/O bus that supported existing expansion cards.


[SPARC]

The Sun SPARC Architecture

In July 1987, Sun announced a RISC architecture called SPARC, which stood for Scalable Processor ARChitecture. Like MIPS, SPARC was the descendant of one of the three pioneering RISC projects, in this case the Berkeley RISC project led by David Patterson. Like MIPS, SPARC was a sleek pipelined design - a radical departure from the complex CISC architectures which held sway at the time. It had about 60 instructions and was a load store architecture with two addressing modes. The instruction set didn't even include integer multiply or divide! SPARC also had one interesting feature which even the MIPS architecture did not have - register windows - which allowed for up to a whopping 520 integer registers (the first implementation had 120), 32 of which were accessible at any given instant. There were also 32 floating point registers.
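The register counts quoted above fall straight out of the window arithmetic. As a sketch (my reconstruction of how the quoted figures arise, not text from the SPARC specification): all windows share 8 global registers, each window contributes 16 registers of its own (8 locals plus 8 ins, since a window's 8 outs overlap the next window's ins), and whichever window is current sees 32 registers at once:

    #include <stdio.h>

    /* Sketch of SPARC register-window arithmetic: 8 shared globals plus
       16 registers unique to each window (8 locals + 8 ins; the 8 outs
       overlap the next window's ins, so they are not counted twice). */
    static int total_registers(int windows)
    {
        return 8 + 16 * windows;
    }

    int main(void)
    {
        printf("32 windows -> %d registers\n", total_registers(32)); /* 520 */
        printf(" 7 windows -> %d registers\n", total_registers(7));  /* 120 */
        printf("visible at once -> %d\n", 8 + 8 + 8 + 8);  /* globals+ins+locals+outs = 32 */
        return 0;
    }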

From the beginning, SPARC was an open RISC architecture (ie: a specification to which anyone could build compatible chips). The idea was to make the architecture open to encourage multiple sourcing and lively competition which would hopefully spur performance and spread the SPARC standard far and wide. This strategy worked well, and by 1995 over a dozen different processors had been built implementing the SPARC architecture, outstripping even the x86 line in terms of the number of different implementations. Holding true to its name, the SPARC architecture scaled very well, ranging from low power notebooks and portables to huge million dollar Cray supercomputers. SPARC based systems went on to dominate the UNIX workstation and server markets.

Rapid time to market was an important goal of the SPARC architecture, so Sun selected a gate-array technology for the first SPARC implementation. It was a 20,000 gate chip running at 16 MHz, with a performance of about 10 MIPS. Fujitsu delivered the first chips in April 1986, a year before the SPARC architecture was officially announced! Sun waited until July 1987 to announce the SPARC architecture so that it could announce the first complete SPARC systems at the same time - the Sun-4/200 family.


[PA-RISC]

HP's Precision Architecture

Hewlett-Packard's Precision Architecture (PA-RISC), introduced in 1986, was another early commercial RISC. It was a relatively conservative design, with a load store architecture and 64 registers (32 int + 32 fp). PA-RISC had an unusually large instruction set for a RISC, partly because the initial design took place before the RISC philosophy became popular. Despite this, it was a simple design, and the first implementation only had 115,000 transistors. PA-RISC processors were used in HP's UNIX workstations, which became quite popular and reached a 25% share of the workstation market.


The C++ Programming Language

Back at AT&T Bell Labs, things hadn't stood still. Several new versions of UNIX had been written, and the C language had been undergoing ANSI standardization. But by far the most interesting thing to happen at Bell Labs in the late 1980's was the emergence of the C++ language as the refined, object oriented successor to C.

Early versions of the C++ language, collectively known as C with Classes, had been in use within Bell Labs since as far back as 1980. The language was originally invented because its creator, Bjarne Stroustrup, wanted to write some event driven simulations for which the Simula language would have been ideal except for its speed. So he wrote a front end preprocessor which allowed Simula style classes to be implemented efficiently in C. This was similar in concept to how RATFOR added structured programming to FORTRAN, but in this case Stroustrup was adding object oriented features to C. Stroustrup's Cfront preprocessor gradually evolved into the first C++ compiler.

C++ was first used outside Stroustrup's research group in 1983, but at that time it lacked many of its final features. Between 1983 and 1991, the language acquired operator overloading, multiple inheritance, templates and exception handling (among other things). Most of these features were driven by experience, and resulted in an efficient and practical, down to earth object oriented language (unlike other OO languages which tended to be slow and were based on abstract methodologies rather than practical experience). Its efficiency, backward compatibility with C, and down to earth approach resulted in C++ gradually succeeding C as the dominant programming language of the computer industry.

The name C++ came from the C increment syntax. Hence C++ was an incremented C (whereas C+ would have been a syntax error).


A Color Mac - The Apple Macintosh II

Although there had been several new Macintosh models since the original Macintosh, the Macintosh II, released in March 1987, was a real leap. It was the first color Macintosh computer. In the normal configuration the color was 8 bitplanes deep - able to display 256 colors at a time from a palette of 16 million! For the power graphics user there was even a 24-bit graphics card available as an expensive extra cost option. The awesome graphics capabilities of the Mac II inspired the development of Photoshop, the feature packed high end photo editing package published by Adobe, which went on to dominate the photo editing market.

In moving to color Apple also did away with the built-in 9 inch screen, moving instead to a more conventional (and expandable) desktop chassis with a separate 14" color monitor. The screen resolution was increased to 640x480, and RAM was expandable up to a whopping 68 Mb. The only thing missing was a custom chip for graphics - the CPU still did all the drawing on the Mac II, which made the graphics a little slow.

The processor was upgraded to a powerful Motorola 68020 with a 68881 FPU, and a new version of the Macintosh operating system was also released which incorporated color capabilities and cooperative multitasking. The Mac II was quite expensive, costing $5498 in a standard configuration with 1 Mb of RAM and a 40 Mb hard disk. Nevertheless, the verdict was unanimous. This was the ultimate Mac - a colorful, multitasking, all-singing, all-dancing high performance machine. Unfortunately, most applications still treated the machine as if it were black and white, at least for a while.


The Acorn Archimedes

The Archimedes (lovingly called the Arc) was the first RISC based personal computer. Introduced in 1987, the Archimedes was based on Acorn's ARM architecture (ARM originally stood for Acorn RISC Machine). The ARM was a simple RISC architecture with 16 registers and no floating point support. The initial ARM implementations concentrated on low cost by using a short 3-stage pipeline. A unique feature of the ARM was that every instruction carried a condition field which indicated under what conditions it should be executed. This enabled simple conditional execution, which eliminated many branches and improved performance.

For a home computer, the Archimedes packed quite a punch. The original Archimedes A305 only had 512 Kb of RAM and monochrome graphics, but its 4 MIPS ARM2 processor offered much better performance than a 68020 based Macintosh II or an 80386 based PC. Later models used the even faster ARM3 processor and added fast Amiga-like color graphics, more memory (up to 4 Mb), and a floating point co-processor. The Archimedes' RISC-OS operating system featured multitasking and a graphical user interface.

The Archimedes sold quite well in the British educational market, which had been Acorn's primary market for its earlier 6502 based BBC machines. Unfortunately, the Archimedes was not a success in the wider marketplace, due mainly to the lower price and larger software market of the Amiga (with which it directly competed), and the Microsoft Windows phenomenon of the early 1990's.


[HyperCard]

HyperCard

In August 1987, an Apple engineer named Bill Atkinson (also the developer of MacPaint) introduced a new type of software development tool - HyperCard. HyperCard was unlike any previous software development tool in two key respects: it was interactive rather than language based, and it was geared towards the construction of user interfaces rather than the processing of data. As such, HyperCard made an ideal tool for rapid prototyping and the development of in-house applications.

HyperCard was loosely based on the hypertext idea of links between pages (in this case screens called cards). Building HyperCard applications (called stacks) required almost no programming skills, so even end users could produce useful and interesting applications. A simple language called HyperTalk was available "behind the scenes" to build more complex stacks. HyperCard was a godsend to researchers in the field of Human Computer Interfaces, since it made experimentation with new user interface styles exceptionally quick and easy.

Apple distributed HyperCard free of charge with every Macintosh sold (until 1992, when it became an extra cost product). Similar tools became available for other platforms over the following decade.


[SPARCstation 1]

The Sun SPARCstation

The Sun SPARCstation 1 was the machine that brought RISC performance to the masses. It wasn't the first RISC workstation. It wasn't even the first SPARC based Sun system. But the SPARCstation 1 set a new standard for price/performance. It churned out a whopping 12.5 MIPS at a starting price of only $8995 - about the cost of a fully configured Macintosh.

The SPARCstation 1 was introduced in 1989. It was based on a SPARC processor running at 20 MHz, and a typical configuration included 16 Mb of RAM, a couple of hundred Mb of hard disk, and a fast and extremely high resolution (1152x900) color graphics system displayed on a huge 19" color monitor. The SPARCstation 1 also had an optical mouse rather than a mechanical one, and built-in ethernet networking. The operating system was SunOS, Sun's version of BSD UNIX. This was later replaced with Solaris, Sun's version of SVR4 UNIX.

Sun sold lots of SPARCstation systems and made the words SPARCstation and workstation synonymous in many people's minds. Engineers and scientists loved them and looked no further for a very long time. A whole range of bigger and better SPARCstation models, including clones, appeared on the market over the next few years, making SPARC the second major platform to use cloning to keep prices down (the IBM PC was the first, of course).


Silicon Graphics IRIS - The Birth of 3D Graphics

[Personal IRIS]

In 1989, Silicon Graphics introduced what was [arguably] the first 3D graphics workstation, the IRIS 4D Superworkstation. It was based on the MIPS RISC architecture, but its real innovation was its high end 3D graphics system, including a 24-bit double buffered extremely high resolution (1280x1024) display, with Z buffering and Gouraud shading in hardware!

The high end GTX graphics system had 5 custom 3D geometry processors and 5 pixel rendering processors which ran in parallel. It could render over 100,000 3D shaded triangles per second, which made it fast enough for interactive 3D solid modeling. And all the 3D graphics hardware was at a programmer's disposal through the IRIS's wonderful GL 3D graphics library.
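For flavor, here is a minimal sketch of that style of programming, written with the later OpenGL descendant of SGI's GL (plus the GLUT toolkit for window setup) rather than the original IRIS GL names: the Z buffer resolves hidden surfaces and the per-vertex colors are Gouraud-shaded across the triangle by the hardware:

    #include <GL/glut.h>

    /* Illustrative only: OpenGL/GLUT spellings, not original IRIS GL code. */
    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glEnable(GL_DEPTH_TEST);    /* hardware Z buffering for hidden surfaces */
        glShadeModel(GL_SMOOTH);    /* Gouraud shading: interpolate vertex colors */

        glBegin(GL_TRIANGLES);
            glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
        glEnd();

        glutSwapBuffers();          /* double buffered display */
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("shaded triangle");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }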

The only problem was the price - the IRIS 4D was really, really expensive! Responding to requests for a lower cost version, Silicon Graphics brought out a Personal IRIS which had cut-down features but was still capable of interactive 3D wireframe and limited solids modeling. The Personal IRIS still wasn't cheap, but for many graphics professionals it was reachable.

The IRIS series and its descendants, together with the GL graphics library and its OpenGL descendant, went on to dominate the 3D graphics market.


[The NeXTstep User Interface]

The NeXT

The NeXT was Steve Jobs' first major computer after leaving Apple. It was unveiled in late 1988 and shipped during 1989, featuring a Motorola 68030 processor (later models moved to the faster 68040), greyscale graphics (color was added later), 8 Mb of RAM, a built-in DSP (digital signal processor) and the first commercial magneto-optical drive (256 Mb capacity). Its NeXTstep operating system was a version of UNIX with a friendly and consistent GUI wrapped around it. The NeXT cost just under $10,000.

Unfortunately, the NeXT had a few critical flaws. The primary programming style for the NeXT was correctly chosen to be object oriented, but the primary language chosen for the machine was Objective-C, a hybrid mix of C and Smalltalk. It should have been C++. Objective-C went on to become a dismal failure in terms of widespread use, whereas C++ went on to supersede C as the dominant language for the entire computing industry. The NeXT was also based on the aging Motorola 68000 CISC architecture. It should have been based on a RISC architecture such as MIPS or SPARC. And finally, the user interface relied heavily on PostScript for its text rendering, which made it slow.

These flaws, plus the fact that it was priced slightly too high, meant that the NeXT never really caught on. The NeXTstep operating system was eventually ported to other platforms including x86 and SPARC, but it still failed to capture any significant market share. Nevertheless, it served as an inspiration for future workstations. It took several years before mainstream UNIX workstations came with a decent GUI, for example.


The History Of Computers
The 1990's


[RS/6000]

The IBM RS/6000

In the early 1980's IBM had been right at the forefront of RISC research with the IBM 801 project led by John Cocke, but it took IBM a long time to produce a successful RISC product. IBM's first commercial RISC, the RT PC, was a flop, but IBM kept trying. Its second commercial RISC product, the RISC System/6000 introduced in 1990, was a good one. The RS/6000 workstation's multi chip POWER1 processor was the first superscalar RISC processor, and racked up speed records in many areas. With over 100 instructions, the POWER architecture barely qualified as a RISC (it even included string operations), but its price was very competitive. IBM used its influence to push for third party software support, and as a result many CAD and scientific applications were ported to RS/6000 workstations running AIX, IBM's version of UNIX.


[Windows 3.0]

Microsoft Windows 3

As its name implies, Windows 3 was not the first release of Microsoft's Windows graphical user interface for PC's. Windows had originally been released in 1985. However, in the past Windows had looked ugly, run slowly and had very little support from third party software developers.

Windows 3 was different. It was still 16-bit, but the user interface was completely revamped to mimic the look and feel of IBM's OS/2, with 3D sculpted buttons. The 640 Kb memory limit was broken (sort of), resulting in better performance and finally giving PC's the chance to run large graphical applications. Multiple programs could be run simultaneously, and although this wasn't true preemptive multitasking it was a big step forward. Virtual memory was also provided.

Most important of all, however, was that at its big launch in May 1990, Microsoft was able to parade an impressive lineup of major software vendors with applications which ran under Windows 3. Among these were versions of the Microsoft Word word processor and Microsoft Excel spreadsheet, which went on to dominate the personal word processing and spreadsheet markets on both Microsoft Windows and the Apple Macintosh.

The bottom line was that a PC running Windows 3 was now almost as easy to use as an Apple Macintosh. Because of this, Windows swept through the PC world like wildfire, and within a year nearly everyone was running it on their PC's. In 1992, version 3.1 was released, which added TrueType fonts and provided better stability, further narrowing the gap between Windows and the Mac. This was followed by Windows for Workgroups (versions 3.1 and 3.11), which added networking support and closed the gap even more.


Apple Sues Microsoft

Apple actually began court proceedings against Microsoft in 1988, when Microsoft released Windows 2. However it wasn't until Windows 3 was released, and Apple immediately expanded its claim to include Windows 3, that the media began to pay major attention to this case. In essence, Apple was arguing that Microsoft Windows breached copyright by being too similar to the Macintosh user interface.

The case ended up taking many years and going through several appeals. The final decision, which came in early 1995 when the last of Apple's appeals was rejected, was that copyright had not been breached. Some people saw this as a good decision because it promoted competition, while others saw it as a terrible decision because it reduced the incentive to develop innovative new technology. This fundamental question is still under debate today, and probably will be forever.


The AMD 386DX

The AMD 386 was the first successful x86 processor that wasn't built by Intel, and as such it started an x86 processor price war. When Intel's original 16 MHz 386 was introduced in 1985, it cost $299. Five years later, it was still commanding the relatively high price of $171, and the 33 MHz version fetched $214. AMD's 40 MHz 386DX was released in March 1991 at $281, but within a year its price had plunged 50% to $140. The prices of PCs followed the chip prices down, and fell by as much as $1000. As a result, the market for PC's running Windows expanded by over 33%.


[Macintosh System 7]

Apple Macintosh System 7

Although not as much of a leap as users had hoped for, the System 7 release of Apple's Macintosh operating system in May 1991 introduced a few new features. Cooperative multitasking had been available through the optional MultiFinder of System 6, but it was made a standard feature of System 7. Virtual memory was also introduced, as was the innovative balloon help system. But the biggest improvement was the TrueType scalable outline fonts. Even though they didn't reach Microsoft Windows until version 3.1 (1992), TrueType fonts almost single handedly created the low cost inkjet printer market. Suddenly PostScript was no longer necessary for high quality printing, ending the domination of laser printers.


[Virtual Reality Devices]

Virtual Reality

In the early 1990's, the availability of high powered 3D graphics workstations from vendors such as Silicon Graphics allowed all sorts of interesting uses for interactive 3D graphics to be developed. Computer aided design (CAD) and 3D animation for special effects were two relatively obvious uses, but a more interesting and exciting use was the concept of virtual reality (VR).

Virtual reality allowed a user to be placed within a virtual world which he or she could explore from an arbitrary point of view. Early uses of VR included walkthroughs of buildings which had not yet been constructed, simulations of environments which were too expensive or too dangerous to perform normal training within (eg: outer space), and multi player virtual reality games. Other more sophisticated uses were found within existing scientific visualization fields such as chemical engineering. In a sense, virtual reality had actually been around for ages in the form of flight simulation.

To make the virtual world convincing to the user, a number of special devices were invented. The most important of these was the stereoscopic head mounted display, which tracked the position and orientation of the user's head and displayed a different image to each eye to trick the user's binocular vision into perceiving depth in the 3D scene. The head mounted display also fed separate audio signals to each ear to produce stereo sound. When driven by appropriate software running on a computer with fast enough 3D graphics, the user could become totally immersed in a realistic virtual environment.

To interact with the virtual environment, the user wore a dataglove on one, or sometimes both, hand(s). This allowed the position of the user's hand(s) to be tracked and drawn within the virtual environment, complete with shape and gesture information. Different gestures were used to perform actions and interact with objects in the virtual world. Some lower cost VR systems used a hand held device with buttons, rather than a glove. Other high end VR systems added simple voice command recognition. One extremely expensive glove device even provided a degree of tactile feedback (touch).

The cost of a head mounted display and a dataglove was very high, so low cost systems gave up on the idea of true immersive virtual reality and concentrated on giving the perception of depth to 3D scenes, usually by synchronizing the display of a normal monitor with a pair of shutter glasses which blanked out each eye in succession.


[Alpha 21064]

The Alpha Architecture - 64-bit Arrives

Digital's Alpha architecture, announced in 1992, was the first true 64-bit architecture. It was designed for a 15 to 25 year lifespan, and aimed to be the major replacement for Digital's VAX architecture, which had dominated the minicomputer and server markets during the 1980's. Alpha was a clean, pure RISC architecture designed to accommodate a 1000 fold increase in performance over its lifetime (10 fold by clock rate, 10 fold by superscalar execution, and 10 fold by multiprocessing). Learning from experience, DEC engineers and architects led by Richard Sites and Richard Witek carefully analyzed and avoided any obvious limits to future performance when they designed the Alpha. And since it was 64-bit from the beginning, kludgy 64-bit extensions were unnecessary (unlike MIPS, SPARC and the PowerPC).

At the launch of the Alpha architecture DEC also announced the first Alpha implementation, the 21064. It immediately jumped to the top of the performance table, clocking in at almost twice the performance of its competitors (for the 200 MHz version). It was a dual issue superscalar processor that contained just 1.68 million transistors, which wasn't many compared to 3.1 million for Sun's SuperSPARC, 2.8 million for the PowerPC 601, and 23 million for the POWER2. In many ways the Alpha 21064 was the antithesis of IBM's POWER designs, which achieved high performance through instruction level parallelism at the expense of a large transistor count and a slower clock. The Alpha 21064 concentrated on the original RISC idea of simplicity and a higher clock rate, but that also had its drawback - very high power consumption.

One of Alpha's major goals was to replace the VAX architecture. To make the VAX to Alpha transition easier for existing VAX customers, DEC provided a translator which converted binary VAX programs into binary Alpha programs. The resulting Alpha programs ran more slowly than if they had been recompiled for an Alpha from source code, but much faster than if they were emulated at runtime using standard interpretation. The Alpha was so much faster than the VAX that there was no performance loss, and usually a substantial performance gain, for translated programs. The same approach was also available for users of the MIPS based DECstations. Three years later, a SPARC to Alpha version of the translator was built in an attempt to encourage the large installed base of SPARCstation users to move to the Alpha platform.
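To see why translation beat runtime interpretation, consider this purely illustrative sketch (a made-up two-instruction machine, not real VAX or Alpha code): an interpreter repeats the fetch, decode and dispatch work below for every instruction it executes, whereas a binary translator does that work once, ahead of time, and emits native code:

    #include <stdio.h>

    /* Illustrative only: a tiny made-up instruction set. */
    enum { OP_ADDI, OP_HALT };

    struct insn { int op; int reg; int imm; };

    int main(void)
    {
        struct insn program[] = {
            { OP_ADDI, 0, 5 },
            { OP_ADDI, 0, 7 },
            { OP_HALT, 0, 0 },
        };
        int regs[4] = { 0 };
        int pc = 0;

        for (;;) {
            struct insn i = program[pc++];   /* fetch */
            switch (i.op) {                  /* decode and dispatch - paid on
                                                every single instruction */
            case OP_ADDI: regs[i.reg] += i.imm; break;
            case OP_HALT: printf("r0 = %d\n", regs[0]); return 0;
            }
        }
    }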


Linux - A Free UNIX

In 1991, a talented programmer named Linus Torvalds, frustrated by the limitations of a small educational version of UNIX called Minix, began writing his own UNIX-like kernel. By mid 1993, Linux had completely outgrown its Minix roots and was becoming quite a usable version of UNIX. It was adopted with great enthusiasm by other programmers on the Internet, and began to spread like wildfire. It soon became the fastest growing version of UNIX, mainly because it was free. Linux ran primarily on x86 based PC's, and it actually ran pretty well even on a slow 386 with 4 Mb of RAM and a 40 Mb hard disk.

Linus and his followers proudly described Linux as a "hacker's system" (ie: a programmer's system) because it relied heavily on freely available software which had been written by other programmers. Its graphics system was the X Window System, which was freely available from MIT. For its GUI it used a collection of freely available window managers and other GUI components, as well as the Athena and OpenLook styles, which were also freely available. Because of its popularity, some companies even began to sell versions of Motif for Linux.

Most of the other programs which people actually used (the shells, the compilers, the utility commands etc) came from the GNU Project - a free software project started by Richard Stallman in the 1980's. Stallman was a talented programmer, but he was also a little unusual. He passionately believed that all software should be free and should come with source code so that other programmers could extend it, and that computing professionals should only make money through consulting. The ultimate goal of the GNU Project was to create a completely free UNIX-like operating system called GNU (which stood for GNU's Not Unix - a recursive acronym!).


[PowerPC]

The PowerPC Architecture

1993 saw the introduction of the PowerPC RISC architecture, a modified version of IBM's successful POWER architecture used in its RS/6000 workstations. PowerPC was the result of an alliance between Apple (who recognized the need to drop the aging 68000 architecture in favor of a RISC), IBM (who were dissatisfied with the PC market after losing control of it to Microsoft and the clone vendors), and Motorola (who had manufactured the 68000 series and wanted to keep making chips for Apple's machines). The AIM (Apple, IBM, Motorola) alliance, which started in 1991, was one that much of the computing industry doubted could ever work. After all, IBM and Apple had been bitter enemies just a few years earlier.

Unlike the expensive multi chip POWER processors, the PowerPC architecture was aimed at low cost single chip microprocessor implementations. As a result, some of the excess baggage of POWER was eliminated or replaced. At the same time, to better accommodate the future, optional 64-bit extensions were added. Unfortunately, this meant that the PowerPC architecture was not totally forward or backward compatible with POWER, causing headaches for compiler writers.

The first PowerPC processor, the PowerPC 601, was released in 1993, two years after the announcement of the PowerPC architecture. It was a three issue superscalar processor which offered high performance at a low cost. It soon made its way into the lower half of IBM's RS/6000 workstation line. A year later it also appeared in Apple's first Power Macintosh computers.

Late 1993 also saw IBM's POWER2 processor succeed the POWER1 as the processor in the high end RS/6000 machines. POWER2 was an expensive and very aggressively superscalar processor. This resulted in very high complexity and an impressive 23 million transistors spread over eight chips! The complexity was well targeted and was quite effective, but it also limited the clock rate - an interesting tradeoff considering that the highly parallel 71.5 MHz POWER2 was faster than the 200 MHz DEC Alpha 21064 (but the POWER2 was also much more expensive).


[Newton]

The Apple Newton

The Apple Newton, released in August 1993, was the first popular hand held personal digital assistant (PDA). The Newton's primary input device was a stylus pen, and it relied heavily on printed handwriting recognition and pen based navigation for its user interface. It was aimed squarely at mobile business professionals, and had a built-in notepad, calculator, to-do list, calendar and address book for organizing personal and business affairs. Using an optional wired or wireless modem, it could send faxes or hook up to the Internet to send and receive email. It even had a version of the popular Quicken financial software to help organize personal and business expenses.

Although it weighed less than 1 lb and was only the size of a small notepad, the Newton had roughly the processing power of an Intel 80486. It used a 20 MHz Acorn ARM RISC processor because its low cost, high speed and low power consumption made it ideal for the Newton's relatively demanding handwriting recognition based user interface. The Newton also had a 336x240 pixel reflective LCD display, 640 Kb of RAM, 1 Mb of non-volatile RAM, and 4 Mb of ROM containing its pen based operating system and built-in applications. The Newton communicated with other Newtons and normal desktop computers through an infra-red signaling system, and used credit card sized plug-in cards for expansion devices.

The Newton cost $699, and 50,000 units were sold in the first 10 weeks. Unfortunately, the handwriting recognition in the first generation of Newtons was notoriously unreliable, and the product was poorly received as a result.


[CDE]

A Standard UNIX - COSE & CDE

Since the great divide of the early eighties, UNIX had suffered from a severe lack of standardization between platforms. The great divide had split UNIX into two distinct families - BSD from Berkeley and System V from AT&T. Although they had basically the same functionality, there were sufficient differences between BSD and System V that virtually every application of any substance needed to be modified to work on "the other" system. On top of this, the X Window System had even bigger problems stemming from the different GUI's available - Athena, OpenLook, Motif, DECwindows etc.

Product differentiation forces within the UNIX market had meant that practically no two systems were alike. Sun used BSD and OpenLook, HP used System V and Motif, Digital used BSD and DECwindows, and so on. All up, the UNIX market was a mess and end users were frustrated by the differences between "UNIX" systems. However, in 1993 the looming threat of Microsoft's Windows NT forced the UNIX vendors to finally see the light. Convinced that a divided UNIX market would never fight off Windows NT, but a united one might, they quickly agreed to standardize.

A consortium consisting of Sun, HP, IBM, Digital, AT&T Bell Labs, Novell and SCO all agreed on a single Common Operating Software Environment (COSE) and a Common Desktop Environment (CDE). The basis of COSE was Spec-1170, a UNIX API specification based on AT&T's UNIX System V Release 4 (SVR4), which combined most of the BSD and System V functionality into a single version of UNIX. CDE sat on top of this, and consisted of the X Window System with the Motif user interface and a desktop manager based on HP's Visual User Environment (VUE). Finally, Sun's desktop utilities were converted to Motif and became the utilities supplied with CDE.

Over the next couple of years Sun moved to Solaris (SVR4) and gradually dropped OpenLook. Similarly, DEC moved to OSF/1 and gradually dropped DECwindows. Other vendors acted likewise, and by the end of 1995 all the major UNIX variants were COSE/CDE systems, with the exception of Silicon Graphics and Linux. Silicon Graphics adopted SVR4 and Motif, but used its own desktop manager rather than CDE's VUE based one. Linux, of course, couldn't adopt Motif because Motif was not free. But even Linux largely followed the SVR4 conventions, and users could buy Motif separately if they wanted it.


[Pentium]

The Intel Pentium

The Intel Pentium processor began shipping in late 1993, and swept through the PC industry faster than any of Intel's previous processors. Although Intel's 80486 (1989) included a built-in FPU and was much faster than the 80386, it was the Pentium that introduced the next leap forward in the x86 microarchitecture: superscalar pipelines. Skeptics said a CISC architecture couldn't do it, but the Pentium proved otherwise. The Pentium contained 3.1 million transistors and initially ran at 60 MHz. It was called the Pentium rather than the 80586 because Intel could not trademark a bare number, and a trademarked name helped distinguish it from the similarly numbered x86 processors of AMD and NexGen (such as the AMD386 and Nx586).

Although it dominated the PC world, the Pentium had a checkered career. In November 1994, a mathematics professor, Thomas Nicely, discovered a serious error in the precision of the Pentium's floating point divide operation. This meant that any previous floating point calculation involving division was now suspect if it had been performed on a Pentium. Even worse, it emerged that Intel had known about the flaw for months, but had decided to say nothing. In the end, a month of intense market pressure forced Intel to offer every Pentium owner a free replacement - a reimplementation of the processor which didn't have the flaw.

In 1994 and 1995 a typical Pentium based PC had a Pentium processor running at between 60 and 120 MHz, 4 to 16 Mb of RAM, a couple of hundred Mb of disk space, 8-bit 640x480 "SuperVGA" graphics, a 14" color monitor, a CD-ROM drive, and ran Windows 3.1. It typically cost around $1800 to $2500, depending on the specific configuration.


[PowerMac 6100]

Apple's Power Macintosh

Apple's Power Macintosh computers, introduced in March 1994, marked the successful transition of RISC into the mainstream personal computer market.

Unlike the painful CISC to RISC transitions which had occurred in the UNIX market, Apple handled the transition from its existing base of 68000 based Macintoshes to the new PowerPC based ones beautifully. To provide complete backward compatibility with existing 68000 software, the Macintosh operating system was augmented to emulate the 68040 processor. This emulation required no user intervention, and worked even for many device drivers. And because the new PowerPC processors were so fast, the emulation overhead was tolerable. Naturally, native PowerPC applications ran faster than emulated 68000 ones, but there was only a slight loss of performance for 68000 based software when running on a PowerMac, compared to a 68040 based Macintosh. The concept worked so well that within the first year over a million PowerMac's had been sold, rocketing the PowerPC up to the top of the RISC architectures in terms of importance.

The initial PowerMac 6100 had a 60 MHz PowerPC 601 processor, 8 Mb of RAM, 16-bit 640x480 graphics, 16-bit stereo sound, a 250 Mb hard disk, a CD-ROM drive and built-in ethernet. Its 14" color monitor was a unique design with the speakers mounted in an angled panel below the display. In this configuration, the PowerMac 6100 cost $2289. The more expensive 7100 and 8100 models had faster processors (66 and 80 MHz) and were more expandable.

Efficient emulation of the 68040 was absolutely critical to the success of the PowerMac, since even some parts of the Macintosh operating system were still 68000 code when the PowerMac was first released (much of the code had been written in assembly language back in the mid 1980's). About a year after the introduction of the PowerMac, a start-up company called Connectix announced SpeedDoubler - a much faster 68040 emulator for the PowerMac based on dynamic compilation. Users were quick to adopt it. Recognizing the importance of emulation performance, Apple soon changed their 68040 emulator to use dynamic compilation.
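The essence of dynamic compilation is easy to sketch. The following toy C fragment (with hypothetical names - it does not reflect Connectix's or Apple's actual emulators) shows the key structure: blocks of guest code are translated the first time they are executed, the result is kept in a cache keyed by the guest program counter, and later executions of the same block run the cached translation directly instead of interpreting the guest instructions again.

    /*
     * Minimal translation-cache sketch.  A real 68040-to-PowerPC emulator
     * would emit PowerPC machine code; here a compiled block is just a C
     * function pointer standing in for that generated code.
     */
    #include <stdio.h>

    #define MAX_BLOCKS 256

    typedef void (*CompiledBlock)(void);

    static CompiledBlock cache[MAX_BLOCKS];       /* keyed by guest block index */

    static void translated_block(void)            /* stand-in for emitted code  */
    {
        printf("running translated block\n");
    }

    static CompiledBlock compile_block(int guest_pc)
    {
        printf("translating guest block %d (expensive, done once)\n", guest_pc);
        return translated_block;
    }

    static void execute(int guest_pc)             /* translate on first use only */
    {
        if (cache[guest_pc] == NULL)
            cache[guest_pc] = compile_block(guest_pc);
        cache[guest_pc]();
    }

    int main(void)
    {
        execute(7);   /* first time: translate, then run            */
        execute(7);   /* second time: run straight from the cache   */
        return 0;
    }

Because most programs spend most of their time re-executing the same loops, nearly all of the work ends up running as translated code, which is why dynamic compilation was so much faster than instruction-by-instruction interpretation.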


[Netscape]

The World Wide Web

Although the Internet had been around for many years, it was the introduction of the World Wide Web which made the Internet popular. The Web offered a simple, friendly, graphical way of browsing for information or entertainment. Electronic storefronts sprang up by the thousands for the tens of millions of Web "surfers" to look at. Suddenly the Internet boomed, and just about everyone who owned a computer wanted to connect. Modem sales skyrocketed.

The World Wide Web was based on the hypertext idea. Information was stored as formatted hypertext in the HTML format, which browser tools such as Mosaic and Netscape could fetch from across the Internet and display to the user. HTML was soon augmented to allow pictures, then video and audio, and even 3D graphics and virtual reality. Eventually, the Web was even able to achieve intelligent interactive content via Sun's Java language.

End users loved the Web because the user interface was a simple point and click style (just click on the hypertext links). As such, it was much easier to use than ftp and telnet. The user base grew very rapidly, doubling every few weeks. Internet cafes appeared in shopping malls so that even people without a computer could surf the web. Then the media got on board. Once that happened there was no stopping it.

The current opinion is that the Web has started the next big boom in the computer industry - the widespread use of networking for both entertainment and commerce.


[OS/2 Warp]

IBM's OS/2 Warp

After the first two largely failed launches of IBM's OS/2 operating system for PC's, the third attempt, released in October 1994 and marketed as OS/2 Warp, finally put OS/2 on the map. IBM backed it with a big new marketing push (the "small planet" campaign), which concentrated on OS/2's networking support and Internet access tools, plus its technical advantages of preemptive multitasking, a true 32-bit design, and backward compatibility with applications written for Microsoft Windows 3.x. In the first five months IBM sold 1.7 million copies of OS/2 Warp, firmly establishing it as the second most popular PC operating system behind Microsoft Windows.


[PowerComputing 100]

Macintosh Clones, Finally

In April 1995, six months after Apple agreed to license its Macintosh operating system for cloning purposes, the first Macintosh clones appeared. Onlookers from the PC world could have been forgiven for thinking that the new clone makers didn't understand what cloning was all about - price. Radius's VideoVision workstation was a souped up PowerMac 8100 aimed at the high end video editing market, and had a price tag of almost $30,000. At the other extreme, Cutting Edge's Quotro 850 was based on the out of date Motorola 68040 processor. Only Power Computing's offerings fit the traditional image of clones: cheaper machines using off the shelf components, with flexible configurations and quick delivery.

Power Computing's first two models, the Power 80 and Power 100, were roughly equivalent to the PowerMac 7100 and 8100, but they used low cost PC components and enclosures wherever possible to keep the cost down, including using a standard PC monitor. As a result, they looked like IBM PC's from the outside. The Power 100 model was priced at $3,349, about $1000 cheaper than a similarly equipped PowerMac 8100.


[Windows '95]

Microsoft Windows '95

After at least eighteen months of pre-release hype, Microsoft finally released Windows '95 on August 24th 1995. The associated marketing campaign was nothing short of amazing. It was a massive global multimedia hype-fest spanning TV, radio, newspapers, magazines, billboards and just about everything else. Ads for Windows '95 were everywhere! Such a barrage had rarely, if ever, been seen in the history of marketing.

Technically, Windows '95 added several important pieces of functionality to the Windows environment. The filesystem could now support filenames longer than the old 8 character DOS limit, and Windows '95 was a free-standing operating system which no longer sat on top of DOS (unlike Windows 3.x). Nested folders were also supported on the desktop (finally!). Windows '95 also had full networking support, including tools for accessing the Internet and Microsoft's own proprietary network (MSN, the Microsoft Network). The native Windows '95 API and most of the operating system were 32-bit, which resulted in improved performance. True preemptive multitasking was provided for native 32-bit Windows '95 applications (but not for 16-bit Windows 3.x applications). And finally, the look of the user interface components was altered to make them more stylish.

All up, Windows '95 was probably closer to Windows NT than to Windows 3.1, marking Microsoft's clear intention for NT to be the successor operating system once the x86 architecture began to fade from the scene.


[Toy Story]

Toy Story

Toy Story, released in late 1995 by Pixar, was the first feature film to be fully generated using 3D computer graphics. It marked the coming of age of 3D graphics, which had previously only been used for short special effect sequences, lasting a few seconds, within traditional films shot with actors and cameras.

The animators who created Toy Story used high end Silicon Graphics workstations to prepare the animations. More than 400 3D models and 2000 texture maps were used, and the two main characters (Woody and Buzz) each had over 700 animation controls, including 212 on Woody's face and 58 on Woody's mouth alone. The modeling and animation preparation took over ten man years to complete. No motion capture was used in the entire film - everything was animated by hand. The final frames were then rendered by a "rendering farm" of 117 multiprocessor Sun SPARCstation 20's. It took 800,000 machine hours to render the 114,200 frames of the 79 minute film.
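Those figures are worth a quick back-of-the-envelope check (assuming the standard film rate of 24 frames per second):

    79 minutes x 60 seconds x 24 frames/second = 113,760 frames
    800,000 machine hours / 114,200 frames = about 7 hours per frame

In other words, the quoted frame count is consistent with a 79 minute film, and each finished frame took on the order of seven hours of machine time to render.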