Farewell to the megahertz myth

  For approximately the first 30 years of the home microcomputer industry,
the primary measuring stick for computer performance and speed was the clock
frequency used by the computer's CPU chip. For many years, this seemed like a
perfectly reasonable and natural way to gauge the relative speed of different
computers. In the early days, when the single-chip semiconductor CPU was
only a few years old, CPUs were not wildly different from one another in how
they worked, and early popular CPUs like the Z80 and 6502 came in various
product models whose performance was distinguished by little more than their
clock frequency.
  This state of affairs continued through all of the 1980s and much of the
1990s, carried on by the reigning CPU manufacturers of the day: Intel, AMD,
and the now-defunct Cyrix. As the 1990s drew to a close, however, it
gradually became clear to those familiar with computer CPUs that the clock
speed of a CPU was not necessarily a reliable indicator of actual computing
performance. CPUs were becoming more varied in how they worked, and there
were more creative (and arguably more effective) ways to speed up a chip
than simply cranking up the clock. This was perhaps most forcefully
demonstrated when Intel's 486 CPU gave way to its fancy new Pentium brand.
The Pentium, when it was first introduced, was available in versions with
clock speeds equal to those of some 486 models. Yet a Pentium running at
exactly the same clock frequency considerably outperformed its 486 kin. In
those days, the RISC-versus-CISC debate also tended to flare up among
computer people, and while that debate has mostly subsided now,
it became clear to people paying attention that just because two CPUs had
clock signals of similar frequencies, this in no way guaranteed comparable
performance.
  I have sometimes used the analogy of engine RPM in a car. If you take a
powerful "muscle car" and rev its engine at, say, 2,500 RPM, then take a tiny
economy car and rev its engine at the same speed, which car do you think has
more power? There is a clear power difference at work even though both
engines are revolving at exactly the same speed; one engine is simply able
to do more with each of those 2,500 revolutions than the other.
  Those who are not particularly into cars may not find that analogy
useful, but there are plenty of other, more everyday analogies that
can be used. Try driving a nail into wood with a hammer by hitting the nail
once every 2 seconds, then try this same activity by striking the nail with
your fingertip at the same rate. The frequency of impacts upon the nail is
the same, yet one action is clearly more effective than the other. The clock
frequency is simply the rate at which the electronic square wave fed into a
CPU's clock input pin oscillates; nothing more, nothing less. It is true that
if you take a single CPU and cut the clock frequency in half, that CPU will
compute at half the speed, but if you take two different CPUs with different
internal workings, they may have equivalent computing power even if one CPU's
clock frequency is double that of the other.
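  One rough way to make that last point concrete is the notion of
instructions per cycle (IPC): a chip's effective throughput is roughly its
clock frequency multiplied by the average number of instructions it
completes on each clock cycle. The short Python sketch below illustrates the
idea with two entirely hypothetical chips; the numbers are made up for
illustration and do not describe any real CPU.

    # Rough model: instructions per second is approximately the clock
    # frequency multiplied by the average instructions completed per cycle.
    def instructions_per_second(clock_hz, instructions_per_cycle):
        return clock_hz * instructions_per_cycle

    # Two hypothetical chips: chip A has twice the clock of chip B,
    # but chip B finishes twice as much work on every cycle.
    chip_a = instructions_per_second(2_000_000_000, 1.0)  # 2 GHz, 1 instruction/cycle
    chip_b = instructions_per_second(1_000_000_000, 2.0)  # 1 GHz, 2 instructions/cycle

    print(chip_a == chip_b)  # True: equal throughput despite the 2x clock difference

Real chips are of course far messier than this simple multiplication, but
the basic relationship is the point.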
  The "megahertz myth", as it came to be known, was propagated in part by
Intel's marketing department. In the late 1990s, Intel consistently produced
CPUs that were comparable in actual computing performance to those produced
by their industry rivals, yet ran at higher clock frequencies. Seeking to
gain whatever publicity advantage they could claim (marketers always do
this), Intel simply proceeded to imply that more hertz automatically equates
to a faster computer. As CPU clock speeds climbed well past one gigahertz,
the megahertz myth was sometimes, quite rightly, called the "gigahertz
myth" instead, but this was relatively rare, probably because it was not as
alliterative, and therefore not as catchy.
  By the time the 2000s were a few years old, however, the megahertz myth
began to lose steam, and it became clear that the approach would eventually
run into rigid physical limits, such as the finite propagation speed of
electrical signals. In the end, those physical limits were never even
reached; the steady rise of clock speeds was instead brought to a halt by a
less rigid but nonetheless quite compelling barrier: heat.
  In their early years, microcomputers had no cooling functionality at all.
None was needed; their CPUs did not generate nearly enough heat to warrant
even passive metal heat sinks, let alone spinning electric fans. This
continued right on up to the 486. Although people occasionally put heat sinks
or fans on the 486, this was generally not necessary. The Pentium was the
first Intel CPU that ran hot enough to require cooling, starting with
passive heat sinks and shortly thereafter moving to actual fans. As CPUs ran
at ever higher clock speeds, their power consumption, and therefore their
operating temperatures, went up. As this trend continued, heat sinks and
fans both grew larger. Even so, by the time the megahertz myth was nearing
the end of its life, CPUs routinely overheated and shut themselves down
despite their fans.
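  As an aside on why higher clocks meant more heat: the usual first-order
rule of thumb for CMOS chips is that dynamic (switching) power grows in
proportion to the clock frequency and with the square of the supply voltage,
and higher clock speeds generally demanded higher voltages as well. The
short Python sketch below illustrates that rule of thumb; the capacitance
and voltage figures are invented purely for illustration.

    # First-order CMOS rule of thumb:
    #   dynamic power ~ switched capacitance * voltage^2 * frequency
    # The capacitance and voltage values below are made up for illustration.
    def dynamic_power_watts(capacitance_farads, voltage_volts, frequency_hz):
        return capacitance_farads * voltage_volts ** 2 * frequency_hz

    early_chip = dynamic_power_watts(1e-9, 1.2, 1_000_000_000)  # about 1.4 W at 1.0 GHz
    late_chip = dynamic_power_watts(1e-9, 1.4, 3_800_000_000)   # about 7.4 W at 3.8 GHz

    print(round(late_chip / early_chip, 1))  # roughly 5x the heat for under 4x the clock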
  The last generation of Intel CPUs built under the clock-frequency-above-all
approach was the Pentium 4 line, which peaked at 3.8 gigahertz. The chips
that actually ran at this speed required cooling fins and fans so large that
it was often difficult to fit them into a standard computer case. Even when
this was done successfully, the CPUs could often be driven to unusably high
temperatures by sustained heavy
calculations. Clearly, the classic trick of turning up the clock frequency
had run its course, and it was no longer practical to continue the game.
  Wisely, and for the benefit of both computer users and Intel itself, Intel
backed down and charted a new course to make CPUs that were more efficient
and were able to compute faster without needing more power and more clock
cycles. The result was Intel's meritorious line of CPUs simply branded under
the name "Core 2", which produced higher performance than their Pentium 4
ancestors, yet used less power and ran at much more manageable temperatures.
They also use a slower clock frequency; the fastest Core 2 chips run at just
under 3 gigahertz, yet they outperform Pentium 4 chips running at well over
3 gigahertz.
  Speaking for myself, I am very glad that Intel dropped the megahertz myth
in favor of better chips that run cooler and are less power-hungry, and I
suspect that most users of Intel CPUs feel the same way.
Perhaps now we can get back to using our machines without the temperature
alarm constantly going off and subsequently shutting down our computers. So
long, megahertz myth; goodbye, and good riddance.

    Source: geocities.com/siliconvalley/2072
