CPU Speeds
Imagine an office
worker who is paid by the number of "sessions" he works. Being devious, he discovers a way to make
twice the money. Instead of one 8-hour
session each day, he comes in for two 4-hour sessions. As inane and obvious as this sounds, it's
what's being done in the computer industry right now, and it's working. When you hear something like "This is a
900-megahertz machine," that means the CPU clock speed is 900 million
cycles per second. This measure, the
"speed" of a computer, is what the general consumer has come to measure
a computer by. So, like the devious
office worker, computer manufacturers have started to find ways to increase the
clock speed of their computers, even if it means reducing the actual
performance of those computers. The
office worker got away with it because the boss wasn't checking how long the
sessions were. Similarly, the computer
manufacturers are getting away with it because the computer-buying masses
aren't considering how much gets done during each clock cycle. It's bad enough that the office worker is
getting paid twice as much, but he's also probably only doing half as much work
as before. Each session has a fixed
amount of overhead time at the beginning and end and the real work only happens
in the middle of the session. This same
problem plagues computer chips that have been given shorter work sessions in
order to increase the total number of work sessions. They suffer from spending most of their time on overhead, and the
actual performance goes way down.
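The analogy can be put in numbers. Here is a toy model (the hours and overhead figures are invented purely for illustration) showing how splitting one long session into two short ones reduces the real work done, even though the total hours "worked" stay the same:

```python
# Toy model of the office-worker analogy: each work session carries a
# fixed amount of overhead, and only the remainder is productive time.
# All numbers are made up for illustration.

def real_work(sessions, hours_per_session, overhead_hours=1.0):
    """Total productive hours across all sessions."""
    productive = max(hours_per_session - overhead_hours, 0.0)
    return sessions * productive

# One 8-hour session: 7 productive hours.
one_long = real_work(sessions=1, hours_per_session=8)

# Two 4-hour sessions: only 6 productive hours from the same 8 hours.
two_short = real_work(sessions=2, hours_per_session=4)

print(one_long, two_short)  # 7.0 6.0
```

The same arithmetic applies to a chip whose clock cycles are made shorter: a fixed slice of each cycle goes to overhead, so more, shorter cycles can mean less total work.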
The first to use
this cheap trick was Intel. To get
their chips up into the 1-gigahertz range, they reduced the work done each clock
cycle to the smallest amount possible.
This greatly hurt their performance, and AMD's chips, which did a more
optimal amount of work each clock cycle, greatly outperformed the Intel
chips. But, the average, naive consumer
drove the market, and Intel was taking sales from AMD at an alarming
rate. The average consumer saw 1100
megahertz on the Intel chip and 800 on the AMD chip and was falsely led to
believe the Intel chip was "faster".
What's worse, AMD soon had to start using the same cheap trick to keep
their own image up to par. So, their
next generation of chips was also crippled to make its clock speed as
high a figure as possible.
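The comparison above can be sketched numerically. The work-per-cycle figures below are hypothetical, chosen only to illustrate how a lower-clocked chip can outperform a higher-clocked one:

```python
# Performance is clock cycles per second times work done per cycle,
# not clock speed alone. The per-cycle numbers here are invented
# purely to illustrate the point.

def instructions_per_second(clock_mhz, work_per_cycle):
    """Rough throughput: cycles per second times work per cycle."""
    return clock_mhz * 1_000_000 * work_per_cycle

# Hypothetical chips echoing the 1100 MHz vs. 800 MHz comparison.
chip_a = instructions_per_second(1100, 0.8)  # higher clock, less work per cycle
chip_b = instructions_per_second(800, 1.2)   # lower clock, more work per cycle

print(chip_a)  # 880000000.0
print(chip_b)  # 960000000.0 -- the "slower" chip gets more done
```

With these assumed numbers, the 800-megahertz chip does roughly nine percent more work per second than the 1100-megahertz one, which is exactly what the megahertz figure hides from the consumer.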
John LeFlohic
March 23, 2002