The number of transistors on chips has followed this pattern for nearly 40 years, and Moore's Law is a widely cited measure of advancement in the computer field.
The size of computer memories depends largely on the number of transistors available on chips, so one consequence of Moore's Law is that computer memories have indeed doubled about every 18 months since 1965. Statement 1 in the opening quiz is true.
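To get a feel for what an 18-month doubling period implies, the arithmetic can be sketched directly. In the Python sketch below, the 1965 starting point comes from the text, while the 2001 end year is merely an illustrative assumption:

# Back-of-the-envelope Moore's Law arithmetic: memory capacity doubles
# every 18 months, starting in 1965 (per the text). The end year 2001
# is an illustrative assumption, not a figure from the text.
START_YEAR = 1965
END_YEAR = 2001            # assumed, for illustration only
DOUBLING_MONTHS = 18

months = (END_YEAR - START_YEAR) * 12
doublings = months / DOUBLING_MONTHS
growth_factor = 2 ** doublings

print(f"{doublings:.0f} doublings -> growth by a factor of {growth_factor:,.0f}")
# 24 doublings -> growth by a factor of 16,777,216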
Similarly, disk technology continues to progress rapidly. In his 1999 book, Structured Computer Organization, Fourth Edition, for example, Andrew Tanenbaum notes that “Measuring disk improvement is trickier, . . . but almost any metric will show that capacities have increased by at least 50 percent per year [since 1982]” (p. 26). Thus, statement 2 in the opening quiz is also true.
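Tanenbaum's 50-percent-per-year figure compounds dramatically. A minimal sketch of the arithmetic, using 1999 (the book's publication year, our choice) as the end point:

# Compounding "at least 50 percent per year" disk capacity growth from
# 1982 onward. The 1999 end point is our assumption, chosen to match
# the publication year of Tanenbaum's book.
RATE = 1.5                 # 50 percent annual growth
YEARS = 1999 - 1982        # 17 years

factor = RATE ** YEARS
print(f"Capacity grows by a factor of about {factor:,.0f} in {YEARS} years")
# Capacity grows by a factor of about 985 in 17 years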
Likewise, the capacity of a central processing unit (CPU) chip depends very much on the number of transistors. Applying Moore's Law to CPUs, statement 3 in the opening quiz is true. Interestingly, a similar trend generally applies to the speed of CPUs as well as to their capacity, as shown in Table 6.1. From Chapter 1, we know that clock speed is only one factor in determining how much work a CPU can do within a given amount of time. Table 6.1 therefore can give only a partial idea of computing power and speed. Regardless of the details, the figures represent huge gains in computing capability over relatively short spans of time. The statistics come largely from Intel (http://www.intel.com/research/silicon/mooreslaw.htm) and from Andrew Tanenbaum (as cited earlier).
Table 6.1 Chips Produced by Intel

Chip          Year Introduced    Transistors    Speed (in MHz)
4004          1971                     2,250    0.108
8008          1972                     2,500    0.108
8080          1974                     5,000    2
8086          1978                    29,000    5-10
80286         1982                   134,000    8-12
80386         1985                   275,000    16-33
80486         1989                 1,180,000    25-100
Pentium       1993                 3,100,000    60-233
Pentium II    1997                 7,500,000    233-400
Pentium III   1999                24,000,000    750-1,000
Pentium 4     2000                42,000,000    1,200-1,400
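The table's own numbers can be checked against Moore's Law: the doubling time implied by any two rows follows from the ratio of their transistor counts. A short Python sketch comparing the first and last rows:

import math

# Implied doubling time from Table 6.1: compare the 4004 (1971,
# 2,250 transistors) with the Pentium 4 (2000, 42,000,000).
months = (2000 - 1971) * 12
doublings = math.log2(42_000_000 / 2_250)

print(f"{doublings:.1f} doublings in {months} months -> "
      f"one doubling every {months / doublings:.1f} months")
# 14.2 doublings in 348 months -> one doubling every 24.5 months

The roughly two-year doubling implied by the table sits comfortably within the 18-to-24-month range commonly quoted for Moore's Law.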