Clock rate

The clock rate is the fundamental rate in cycles per second (hertz) at which a computer performs its most basic operations, such as adding two numbers or transferring a value from one register to another.

Description
The clock rate of a computer is normally determined by the frequency of an oscillator crystal. The original IBM PC, circa 1981, had a clock rate of 4.77 MHz (almost five million cycles per second). By 1995, Intel's Pentium processors ran at 100 MHz (100 million cycles per second). The clock rate of a computer is only useful for comparisons between computer chips in the same processor family. An IBM PC-compatible with a CPU running at 50 MHz will be about twice as fast as one with the same CPU, memory and display running at 25 MHz. There are many other factors to consider when comparing different computers.
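The within-family comparison above can be sketched as a simple calculation: for a fixed workload measured in clock cycles, execution time scales inversely with clock rate. The workload size below is a made-up illustrative number, not a measured figure.

```python
def execution_time(cycles: float, clock_hz: float) -> float:
    """Seconds needed to run a workload of a given number of clock cycles."""
    return cycles / clock_hz

# Hypothetical workload of 100 million cycles on two same-family CPUs.
cycles = 100e6
t_25mhz = execution_time(cycles, 25e6)   # 4.0 seconds
t_50mhz = execution_time(cycles, 50e6)   # 2.0 seconds
print(t_25mhz / t_50mhz)                 # → 2.0, i.e. twice as fast
```

This idealization holds only when everything except the clock (memory, display, instruction mix) stays the same, as the paragraph above notes.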

Clock rates should not be used when comparing different computer architectures or different processor families; a benchmark should be used instead. Clock rates can be very misleading, since the amount of work different computer chips can do in one cycle varies. For example, RISC CPUs tend to have simpler instructions than CISC CPUs (but higher clock rates), and superscalar processors can execute more than one instruction per cycle.
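The cross-family pitfall can be made concrete with the classic CPU-time relation: time = instruction count × cycles per instruction (CPI) ÷ clock rate. The chip figures below are invented for illustration only; they show how a higher-clocked chip can still finish a program sooner (or later) depending on CPI and instruction count.

```python
def cpu_time(instructions: float, cpi: float, clock_hz: float) -> float:
    """CPU-time formula: instruction count * cycles per instruction / clock rate."""
    return instructions * cpi / clock_hz

# Hypothetical chips running the same program (illustrative numbers only):
# a CISC-style CPU needing 50M complex instructions at 2.0 CPI and 50 MHz,
# versus a RISC-style CPU needing 80M simpler instructions at 1.0 CPI and 66 MHz.
t_cisc = cpu_time(50e6, 2.0, 50e6)   # 2.0 seconds
t_risc = cpu_time(80e6, 1.0, 66e6)   # about 1.21 seconds
# The RISC chip wins here despite executing more instructions, which is why
# clock rate alone cannot rank chips from different families.
```

Real comparisons use benchmarks precisely because instruction count and CPI differ per program and per architecture.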