Department of Physics, University of Kerala, Kariavattom 695 581, Thiruvananthapuram
This article outlines the evolution of modern electronics from vacuum tubes, through transistors and integrated circuits, to microelectronics, and the current developments towards nanoelectronics.
1. INTRODUCTION
Electronics is the science and technology of the motion of charges in a gas, vacuum or semiconductor. The history of electronics can be divided into three major periods: the vacuum tube era, the transistor era, and the currently developing nanoelectronics era.
2. VACUUM TUBE ERA
In 1904, Fleming invented a two-element device, the diode, which he called the valve. The invention of the audion (triode) by de Forest in 1906 can be considered the beginning of modern electronics. The first applications of vacuum tubes were to telephone and radio communications. The first broadcast radio systems used amplitude modulation. To improve fidelity and reduce the effect of atmospheric interference, Armstrong developed frequency modulation.
The techniques used in radio broadcasting were adapted to other applications. Telephone systems were transformed into one of the major forms of electronic communication. Radar and Loran, developed during World War II, applied radio techniques as aids to both air and sea navigation.
Black and white television began in the 1930s, based on Zworykin's iconoscope (camera) and kinescope (picture tube). World War II delayed the widespread use of television. The development of colour television began about 1950, and colour sets became dominant during the 1960s.
The 1950s marked the end of the development of vacuum tube systems and the beginning of the transistor age. Today, semiconductor devices dominate the entire field, except for some high-power applications.
3. TRANSISTOR ERA
The age of semiconductor electronics began with the invention of the transistor in 1948. Vacuum tubes had major limitations: power was consumed even when they were not in use, and filaments burned out, requiring tube replacement.
Brattain and Bardeen performed an experiment in December 1947 in which they pressed two closely spaced gold-wire probes into the surface of a germanium crystal. They observed that the output voltage at the collector probe, with respect to the germanium base, was greater than the input voltage at the emitter probe. Thus the solid-state amplifier, in the form of the point-contact transistor, was born.
The performance of the first transistors was very poor. They had low gain and bandwidth, were noisy, and their characteristics varied widely from device to device. Shockley recognized that the difficulties lay with the point contacts. He proposed the junction transistor and developed the theory of its operation. The new devices depend on charge carriers of both polarities (viz., electrons and holes), and hence are known as bipolar devices. Bardeen, Brattain and Shockley were awarded the Nobel Prize in Physics in 1956 for their invention of the transistor and their contributions to the understanding of semiconductors.
Integrated Circuit (IC)
In 1958, Kilby of Texas Instruments conceived the idea of building an entire circuit from germanium or silicon. Resistors were to be formed from the bulk semiconductor or by diffusion, and capacitors were formed using a metallic layer and the semiconductor as the plates and an oxide layer as the dielectric. To demonstrate his concept, Kilby built an oscillator and a multivibrator from germanium. He announced his solid circuit (later called the IC) at the IRE convention in 1959. Noyce also had the monolithic-circuit idea of making multiple devices on a single piece of silicon, so that the interconnections between devices became part of the manufacturing process, reducing size, weight and the cost per active element.
The real key to IC manufacture was the combination of the planar transistor and batch processing. The planar process used transistors in which the base and emitter regions were diffused into the collector; the first such diffused transistors were developed by Hoerni at Fairchild in 1958. The fabrication techniques used were production lithography and the diffusion process developed earlier by Noyce and Moore. Batch processing permitted many IC chips to be made from a single silicon wafer. By 1961, both Fairchild and Texas Instruments were producing ICs commercially, followed by other companies.
Today, in addition to individual circuits, sub-systems and even entire systems containing thousands of components can be fabricated on a single silicon chip. The term microelectronics refers to the design and fabrication of these high-component-density ICs. Moore noted in 1964 that the number of components on a chip had doubled every year since 1959, when the planar transistor was introduced.
The increase in component density owes much to those who improved fabrication processes. These advances include epitaxial growth, electron-beam lithography and ion implantation. Another contribution to reliable IC design and production was the development of computer-aided design (CAD) and automated testing.
SPM possibilities: The invention of the scanning probe microscope (SPM) has opened up the possibility of using a scanned tip to write patterns on a silicon chip at the nanometre level. It has been shown that an STM tip can be used to write patterns in a polymer resist, which can subsequently be replicated in metal. Gold lines as narrow as 20 nm have been produced in this way. The basic feasibility of generating such lines has thus been demonstrated, but a great deal of research remains to be done before this method could be used to make chips.
SCANNING PROBE MICROSCOPY (SPM)
In 1873, Abbe showed that the minimum feature that can be resolved in a conventional optical system is determined by a simple relationship involving the aperture of the system and the wavelength of the light used. By this formula, the resolution of a conventional optical microscope is limited to a fraction of a micrometre, roughly half the wavelength of the light used. The Transmission Electron Microscope (TEM), invented in 1931 by Ernst Ruska and Max Knoll, uses electrons accelerated to energies of hundreds of keV, at which they have an equivalent wavelength of a few picometres. Because of this short wavelength, resolutions of 0.1 nm (i.e., atomic resolution) can be achieved.
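As a rough numerical illustration of these two limits, the short sketch below evaluates the Abbe relation, d ≈ λ/(2·NA), for green light with an assumed numerical aperture of 1.4, and the relativistic de Broglie wavelength of an electron accelerated through an assumed 200 kV; both parameter choices are illustrative, not values taken from the text.

    import math

    # Illustrative check of the resolution limits discussed above.
    H = 6.62607015e-34       # Planck constant, J s
    M_E = 9.1093837015e-31   # electron rest mass, kg
    C = 2.99792458e8         # speed of light, m/s
    EV = 1.602176634e-19     # joules per electron-volt

    # Abbe limit d ~ lambda / (2 NA) for green light and an oil-immersion objective.
    wavelength = 550e-9      # m (assumed)
    NA = 1.4                 # numerical aperture (assumed)
    print(f"Optical resolution limit ~ {wavelength / (2 * NA) * 1e9:.0f} nm")   # ~200 nm

    # Relativistic de Broglie wavelength of an electron accelerated through 200 kV.
    E = 200e3 * EV           # kinetic energy, J (assumed accelerating voltage)
    lam = H / math.sqrt(2 * M_E * E * (1 + E / (2 * M_E * C**2)))
    print(f"Electron wavelength at 200 keV ~ {lam * 1e12:.1f} pm")              # ~2.5 pm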
The principle of making a microscope by scanning a physical probe across the surface of an object and measuring a consequent effect of its features has been known for many years. Probably the first of these microscopes was the scanning electron microscope (SEM). Here the resolution is limited by the electron beam spot size, which in turn is limited by the electron source and by electromagnetic lens aberrations.
Synge made the first suggestion of a form of super-resolution microscope in 1928. He proposed that it might be possible to fabricate a tiny, sub-wavelength-sized aperture at the end of a glass rod and, by sending light down the rod and scanning the tip across a surface, build up an image of the surface with a resolution equivalent to the size of the aperture.
The Scanning Tunneling Microscope (STM) uses a similar principle. Here, the evanescent wave is an electron wave function, with an intrinsic wavelength of about 1 nm, which extends beyond the surface of a sharp metal tip. If a conducting surface is brought to within about 1 nm of the tip and a potential difference is applied between them, a tunneling current will flow. The magnitude of this current is an exponentially decaying function of the tip-to-surface distance and also depends on the work functions of the two materials. Thus, information can be derived about both the topography of the surface and its chemical make-up.
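The exponential distance dependence can be illustrated with the standard one-dimensional tunnelling estimate, I ∝ exp(-2κd) with κ = √(2mφ)/ħ; the sketch below assumes a typical effective barrier height of 4.5 eV, an illustrative value rather than one taken from the text.

    import math

    # Standard 1-D tunnelling estimate of the STM current-distance dependence:
    # I ~ exp(-2 * kappa * d), with kappa = sqrt(2 * m * phi) / hbar.
    HBAR = 1.054571817e-34    # J s
    M_E = 9.1093837015e-31    # electron mass, kg
    EV = 1.602176634e-19      # joules per electron-volt

    phi = 4.5 * EV            # effective barrier height (assumed typical work function)
    kappa = math.sqrt(2 * M_E * phi) / HBAR    # inverse decay length, ~1.1e10 m^-1

    for d_nm in (0.5, 0.6, 0.7):               # tip-sample gaps in nm
        rel = math.exp(-2 * kappa * d_nm * 1e-9)    # current relative to d = 0
        print(f"d = {d_nm} nm  ->  relative current {rel:.1e}")
    # Each extra 0.1 nm of gap cuts the current by roughly an order of magnitude,
    # which is the origin of the STM's extreme vertical sensitivity.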
Several extensions of the STM technique have been demonstrated, including Scanning Noise Microscopy, in which no bias is applied to the tip; instead, the broadband thermal (Johnson) noise across the junction, whose mean-square value is proportional to the tip-to-sample resistance, is used to control the tip-to-sample distance.
The limitation of STM is that it can only work with
conducting surfaces. The Scanning Force Microscope or Atomic Force Microscope (AFM) was
developed to overcome this limitation. Other SPMs include Scanning Thermal Microscopy,
which detects the flow of heat from the tip to the sample using a fine thermocouple as the
probe; and Scanning Near Field Optical Microscopy (SNOM). In SNOM, an optical fibre is
drawn to a fine tip. The light reflected from the tip of the fibre as it is scanned is
used to reconstruct the surface image.
4. TOWARDS NANOELECTRONICS
Biology is often used both as an example of, and proof that, molecular-based or bottom-up nanotechnologies are possible. Biological examples can easily be found that demonstrate molecular-based information storage, information processing, molecular replication and nanoscale mechanical motors. The pre-programmed sequences of amino acids and nucleotides in protein and nucleic acid molecules represent routine information storage at a density far greater than currently achievable by modern technologies. The transcription of a single DNA sequence into multiple RNA molecules, and their subsequent translation into protein molecules, represents molecular processing of information and molecular replication. Whilst these biological examples have been designed by millions of years of evolution to fulfil particular roles, it is hoped that a rational-design approach based upon our developing understanding can improve upon nature and overcome the limitations of natural biological systems, such as (a) poor performance at elevated temperatures, (b) poor structural performance (for some applications) and (c) a limited range of naturally occurring starting materials.
A nanometre is 10⁻⁹ metre, one billionth of a metre: about 80,000 times smaller than the diameter of an average human hair and about 10 times the diameter of a hydrogen atom. The term nanotechnology is frequently used as a general description of the science of atomic-scale phenomena. It is defined by the journal Nanotechnology as all those technologies and enabling technologies associated with either the top-down approach of fabricating miniature elements by sculpting the desired structure from a microscopic piece of material, or the bottom-up approach of building the desired structure molecule by molecule or atom by atom.
Nanotechnology brings together engineering, physics, chemistry and biology. It includes materials processing through removal, accretion, surface transformation, joining and assembly, right down to the identification, manipulation and assembly of individual molecules. Many very high resolution techniques, such as x-ray, electron beam and scanning probe microscopy (SPM), have been developed for the study of the physical phenomena of matter at the sub-nanometre and atomic scales. Through these physical techniques, analysis and even manipulation of structures at these scales have become possible.
Nanotechnology is a group of generic technologies that are
becoming crucially important to many industrial fields and offering great promise of
massive improvements to standards of living throughout the world. It is also a new way of thinking about possible
solutions to problems currently obstructing developments that can enhance the welfare of
mankind. The main driving forces in this broad field from micro to nano systems are:
· new products that can work only on a very small scale or by virtue of ultra-precision technologies
· higher systems performance
· miniaturisation, motivated by smaller, faster, cheaper
· higher reliability, and
· lower cost
Biosensors: The demand for real-time information for the efficient control of processes dictates in situ measurement, and therefore sensor-type approaches of small size, a common requirement for practical use. Nanotechnology impacts upon this situation via (a) the microengineering of sensors and (b) the design and control required at molecular scales to transduce measurable parameters into easily detectable, typically electronic, signals.
Biosensors exploit the exquisite selectivity exhibited by biological systems, which enables the recognition of one molecular species in the presence of complex mixtures of other, often closely related, molecular species commonly found in sample matrices. The integration of biological systems, typically enzymes and antibodies, with suitable physical transducers enables the transduction of bio-recognition events into measurable signals. For example, systems based upon enzymes that perform oxidation and reduction reactions (redox enzymes, such as glucose oxidase for the determination of blood glucose levels) require the efficient communication of electrons, generated at the enzyme's active site during the recognition and catalytic events, to an electrode for measurement. Various approaches are being developed to enable direct electronic communication between the active site and the electrode through molecular wiring. The means of integrating the biological component of a biosensor is central to its eventual performance, and advances in surface science, such as self-assembled systems, are being actively investigated to ease the production of devices, increase device stability, reduce interferences and maximise signal output.
Biochips: Many examples exist where large numbers of individual biological analyses, i.e., biological assays, commonly 10³ to 10⁶, need to be performed; these include the screening of libraries of potential pharmaceutical compounds and various protocols for the screening and sequencing of genetic material. Such large numbers dictate the parallel processing of assays to enable completion in reasonable time scales, and the common availability of only small sample quantities dictates small size. Biosensors are therefore microfabricated into high-density arrays called biochips, an integration of biology with microchip-type technologies. For example, devices are being developed for genetic screening that contain two-dimensional arrays with greater than 10⁵ elements, each comprising a differing DNA sequence, where each element is optically examined for specific interaction with complementary genetic material.
DNA-based computing: A novel technological application of nanoscale biology is the recent demonstration of DNA-based computing, where the tools of molecular biology were used to solve a mathematical problem. The problem was encoded in the sequences of DNA molecules, the operations of the computation were performed by the self-assembly of complementary DNA sequences, and the answer (output) was obtained by characterising the resulting self-assembled (i.e., hybridized) DNA with standard molecular biology tools. Although the calculation required days of laboratory work, the parallel nature of the computation and the possibilities for scale-up suggest further developments of this method.
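The demonstration referred to here is generally identified with Adleman's 1994 experiment, which encoded a small directed Hamiltonian-path problem in DNA. The sketch below is a purely conventional, in-silico illustration of the same generate-in-parallel-then-filter strategy; the example graph, pool size and vertex labels are arbitrary choices for illustration and do not come from the original experiment.

    import random

    # In-silico analogue of the DNA strategy: generate a huge pool of candidate
    # paths "in parallel" (random ligation of edge strands), then filter the pool
    # to keep only paths that start at 0, end at 4 and visit every vertex once.
    EDGES = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (1, 3), (2, 4)}   # made-up graph
    N = 5                                      # vertices 0..4

    def random_path():
        """Random walk along directed edges, mimicking random strand assembly."""
        path = [0]
        while len(path) < N:
            nxt = [b for (a, b) in EDGES if a == path[-1]]
            if not nxt:
                return None                    # candidate terminated early
            path.append(random.choice(nxt))
        return path

    pool = [p for p in (random_path() for _ in range(100000)) if p]    # generation step
    answers = {tuple(p) for p in pool if p[-1] == N - 1 and len(set(p)) == N}
    print(answers)                             # expected: {(0, 1, 2, 3, 4)}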
5. CONCLUSION
The evolution towards the
nanometric age is gathering pace. It
has the potential to provide revolutionary benefits.