The Invention of the Century
"In the summer of 1969, Busicom, a now defunct Japanese calculator manufacturer, asked Intel to develop a set of chips for a new line of electronic programmable calculators. (The IC's extraordinary capabilities were leading to a blurring on the line between calculators and computers.) Busicom's engineers had worked up a preliminary design that called for twelve logic and memory chips, with three to five transistors each. By varying the Read Only Memories (ROMs), Busicom planned to offer calculators with different capabilities and options. One model, for example, contained a built in printer. The company's plans were quite ambitious ; at the time, most calculators contained six chips of six hundred to a thousand transistors each. But Intel had recently developed a technique for making two thousand transistor chips, and Busicom hoped that the firm could make even more sophisticated ICs."
"Intel assigned the Busicom job to Marican E Hoff, a thirty two year old engineer with a BS in engineering from Rensselaer polytechnic Institute in Troy, New York, and a Doctorate from Stanford. A natural engineer, with the thoughtful manner of a Professor and the caution of a corporate executive, Hoff -- Ted to his friends -- had a knack for spotting new solutions to technical problems..."
"Hoff studied Busicom's designs and concluded that it was much too complicated to be cost effective. Each calculator in the line needed one set of logic chips to perform basic mathematical functions and another to control the printers and other peripheral devices….."
"Even though some the calculator's functions would be controlled by Read Only Memories (ROMs) which sent instructions to the logic chips, and therefore enabled some of them to do slightly different tasks, the bulk of the operations would be performed by the logic ICs. Although Intel could have produced chips of the required complexity, the productive yield -- the number of working chips would have been prohibitively low."
"Busicom was looking forward and backward at the same time. The first logic chips, produced in the early sixties, contained only a handful of components. As the state of IC technology advanced, the devices became more complex and powerful, but they also became more difficult and expensive to design. Since a different set of logic chips was required for every device, the chips destined for one gadget could not be used in another. The IC companies developed various techniques to streamline the design process -- using computers to help lay out the chips, for example -- but the results were disappointing. An engineering bottleneck was developing ; the IC industry would not be able to keep up with the burgeoning demand for its components, no matter how many engineers it employed."
"Fortunately Hoff came up with a solution. Why not, he suggested, develop a general purpose logic chip, one that could, like the central processor of a computer, perform any logical task ? "
"Like a conventional central processor, the microprocessor would be programmable, taking instructions from Random Access Memories and the Read Only Memories. So if a customer (like Busicom) wanted to make a calculator, it would write a calculator program, and Intel would insert that program into ROM (Read Only Memory). Each calculator would need one microprocessor and one programmed ROM, along with several other chips… Similarly if another customer came along with plans for a digital clock, it would devise a clock program, and Intel would produce the requisite ROMs. This meant that Intel would not have to work up a new set of logic chips for every customer ; the burden of the design would be shifted to the customer and transformed into much less costly and time consuming matter of programming."
"It was a brilliant idea. Instead of twelve chips, Busicom's calculators now needed only four…Not only did Hoff's invention cut down the number of chips and therefore the number of interconnections,.. it also resulted in a much more flexible and powerful family of calculators. Literally a programmable processor on a chip, a microprocessor expands a device's capabilities at the same time as it cuts its manufacturing costs. In other words, it is one of those rare innovations that gives more for the less. Busicom accepted Hoff's scheme and the first microprocessor, designated the 4004, rolled off Intel production line in the late 1970."
"The 4004 was not a very potent computational tool. With 2250 transistors, it could process only four bits of data at a time and carry out about 60,000 operations a second. It was not powerful enough to serve as the central processor of a mini computer, but it was quite adequate for a calculator and other relatively simple electronic devices, like taximeters or cash registers. The other three chips in the set were also limited. The ROM, which contained the inner program that governed the calculator, stored 2K bits of data, and the RAM which provided the temporary storage held a mere 320 bits. Nevertheless, the four chips constituted a bona fide computer, that, mounted on a small circuit board, occupied as much space as a pocket book."
"Because Intel had developed the 4004 under contract for Busicom, the Japanese company had an exclusive right to the chip and Intel could not offer it on the open market. But in the summer of 1971, Busicom asked Intel to cut its prices -- the calculator business had become quite competitive -- and in exchange for the price reduction, Intel won the right to market the 4004. Even so the company hesitated. No one had fully grasped the enormous utility of Hoff's invention, and Intel assumed that the chip would be used chiefly in calculators and minicomputers. About 20,000 minicomputers were sold in 1971 ; at best, the 4004… would wind up in 10 percent of these machines -- a prospect that was not very interesting to small semiconductor company bent on becoming a big one."
"Although Intel did not realise it at first, the company was sitting on the device that would become the universal motor of electronics, a miniature analytical engine that could take the place of gears, axles and other forms of mechanical control. It could be placed inexpensively and unobtrusively in all sorts of devices -- a washing machine, a gas pump, a butcher's scale, a jukebox, a typewriter, a door bell, a thermostat, even, if there was a reason, a rock. Almost any machine that manipulated information or controlled a process could benefit from a microprocessor. Fortunately, Intel did not have an inkling of the microprocessor's potential. The company decided to take a chance and the 4004 and the related chips were introduced to the public in November 1971."
"Not surprisingly, the 4004 sold slowly at first, but orders picked up as engineers gained a clearer understanding of the chip's near-magical electronic properties. Meanwhile, Intel went in work on more sophisticated versions. In April 1972, it introduced the first eight bit microprocessor, the 8008, which was powerful enough to run a minicomputer…The 8008 had many technical drawbacks, however and it was superseded two years later by a much more efficient and powerful microprocessor, the now-legendary 8080. The 8080 made dozens of new products possible -- including the Personal Computer. For sseveral years Intel was the only microprocessor maker in the world and its sales soared. By the end of 1983, Intel was one of the largest IC companies in the country, with 21,500 employees and $ 1.1 billion in sales."
… Excerpted from Bit By Bit -- An Illustrated History of Computers by Stan Augarten, published by Unwin Paperbacks, London (1984).
What followed thereafter was history. The 8080 was succeeded by the 8085 -- another marvel, which made microprocessor applications child's play (it ran off a single five-volt power supply). Intel's documentation, designated MDS-8085, is even today reckoned as the prime reference material on microprocessor fundamentals.
While all this was still in the eight-bit environment, Intel understood that the market was indeed headed for 16-bit versions (and beyond, later). The 8086 was born, and the IBM PC adopted this component. The PC market was so demanding that the simple 8086 had to be upgraded to the 80286, -386 and -486. Further enhancements were to be the Pentium, Celeron and the like -- presumably a business decision. There were also packaging innovations, as the pinouts became simply enormous and unmanageable. An industry of microprocessors as the gear and axle of electronic devices had become firmly entrenched, just as Stan Augarten had believed it would.
"Bit By Bit -- is much more than a story of the machines : it is also about the brilliant, forward looking and often eccentric men and women who have shaped the computer's history -- from William Shickard, the obscure German Professor who invented the first mechanical calculator in 1623 ; to Charles Babbage, the debonair nineteenth century genius whose Analytical Engine came within an inch of being a full fledged computer ; to Stephen Wozniak, the young electronics wizard who founded Silicon Valley's Apple Computer company."
In our household, dear reader, this book is respected as much as the Gita. Children of more than one generation have read and reread it many times. I am afraid you may not be able to find this book easily now. That was the provocation for the extracts above. Many postulates in the book may seem obvious today; the book tells you of the visionaries who imagined these concepts for the first time. To have visualized all this ahead of their times -- these giants have indeed done remarkable service to the cause of what we now call the Convergence.
Today we can see farther because we stand on their shoulders. I feel the single most important contribution of Electronics to humanity is the microprocessor. I recommend that it be named the Invention of the Century.
The Editor asked -- what has impressed you as the most outstanding contribution of the twentieth century to the technologies of communications? This article appeared in response in the Telematics Magazine of India in 1999, a few months short of the close of the century. Most of the matter was excerpted; the job left was only that of editing it into coherent, readable material within the space provided.