that served only a few users. Fortunately, silicon fabrication companies could still
produce faster, lower-power, and less expensive chips for applications that needed
millions of chips. But, for applications with only a few users, such as prototyping,
low-volume design applications, and education, FPGAs remain a popular tool for
building hardware.
Up until 1992, personal computers were either 8-bit, 16-bit, or 32-bit. Then
DEC came out with the revolutionary 64-bit Alpha, a true 64-bit RISC machine
that outperformed all other personal computers by a wide margin. It had modest
success, but almost a decade elapsed before 64-bit machines began to catch on in a
big way, and then mostly as high-end servers.
Throughout the 1990s computing systems were getting faster and faster using a
variety of microarchitectural optimizations, many of which we will examine in this
book. Users of these systems were pampered by computer vendors, because each
new system they bought would run their programs much faster than their old
system. However, by the end of the 1990s this trend was beginning to wane because of
two important obstacles in computer design: architects were running out of tricks
to make programs faster, and the processors were getting too expensive to cool.
Desperate to continue building faster processors, most computer companies began
turning toward parallel architectures as a way to squeeze out more performance
from their silicon. In 2001, IBM introduced the POWER4 dual-core architecture.
This was the first time that a mainstream CPU incorporated two processors onto
the same die. Today, most desktop and server-class processors, and even some embedded
processors, incorporate multiple processors on a single chip. The performance of
these multiprocessors has unfortunately been less than stellar for the typical user,
because (as we will see in later chapters) parallel machines require programmers to
explicitly parallelize programs, which is a difficult and error-prone task.
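To make that burden concrete, here is a minimal sketch, not taken from the text, of what
explicitly parallelizing even a trivial program looks like in C with POSIX threads (an
assumption; any threading library would make the same point): the programmer, not the
hardware, must divide the work, launch the threads, and combine the partial results by hand,
and each of those steps is a fresh opportunity for mistakes such as data races or incorrect
splits.

    /* Hypothetical sketch: summing an array with two POSIX threads.
     * The work split, thread creation, and recombination are all explicit. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    static long data[N];

    struct slice { long *start; long count; long long sum; };

    /* Each thread sums only its own slice of the array. */
    static void *partial_sum(void *arg)
    {
        struct slice *s = arg;
        s->sum = 0;
        for (long i = 0; i < s->count; i++)
            s->sum += s->start[i];
        return NULL;
    }

    int main(void)
    {
        for (long i = 0; i < N; i++)
            data[i] = i;

        /* The programmer decides how to split the work: here, two halves. */
        struct slice halves[2] = {
            { data,         N / 2,     0 },
            { data + N / 2, N - N / 2, 0 },
        };
        pthread_t tid[2];
        for (int t = 0; t < 2; t++)
            pthread_create(&tid[t], NULL, partial_sum, &halves[t]);
        for (int t = 0; t < 2; t++)
            pthread_join(tid[t], NULL);

        /* The partial results must be combined by hand. */
        printf("total = %lld\n", halves[0].sum + halves[1].sum);
        return 0;
    }

Compiled with cc -pthread, this prints the same total a sequential loop would, but only
because the two slices are disjoint; had both threads updated a single shared accumulator,
the program would already contain a data race.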
1.2.6 The Fifth Generation—Low-Power and Invisible Computers
In 1981, the Japanese government announced that they were planning to spend
$500 million to help Japanese companies develop fifth-generation computers,
which would be based on artificial intelligence and represent a quantum leap over
"dumb" fourth-generation computers. Having seen Japanese companies take over
the market in many industries, from cameras to stereos to televisions, American
and European computer makers went from 0 to full panic in a millisecond,
demanding government subsidies and more. Despite lots of fanfare, the Japanese
fifth-generation project basically failed and was quietly abandoned. In a sense, it
was like Babbage's analytical engine—a visionary idea but so far ahead of its time
that the technology for actually building it was nowhere in sight.
Nevertheless, what might be called the fifth generation did happen, but in an
unexpected way: computers shrank. In 1989, Grid Systems released the first tablet
computer, called the GridPad. It consisted of a small screen on which the users
could write with a special pen to control the system. Systems such as the GridPad