Developments in computer architecture: RISC and ARM
The idea of RISC - standing for Reduced Instruction Set Computing - originated on the East and West coasts
of the United States at around the same time in the early 1980s. At IBM Research in the 1970s and 1980s, John
Cocke (B.8.19) had investigated how often the individual instructions of an instruction set were actually executed when running a representative set of programs. He discovered that a small set of instructions was executed far more frequently than the rest and proposed that only this reduced instruction set should be implemented in hardware. The more complex instructions of the standard approach can then be built up out of this smaller set. Having only a small instruction set simplifies the circuit design and enables us to build fast computers with low power consumption.
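The decomposition idea can be made concrete with a small sketch. The Python fragment below is purely illustrative - the three-instruction machine, the register and memory names, and the add_mem helper are invented for this example rather than taken from any real processor - but it shows how a "complex" CISC-style memory-to-memory add can be built entirely from a reduced set of load, add, and store instructions.

# Minimal sketch of the RISC idea: a "complex" memory-to-memory add
# expressed as a short sequence drawn from a small instruction set.
# The machine model and instruction names are invented for illustration;
# they do not correspond to any real processor.

memory = {"X": 7, "Y": 35, "Z": 0}    # toy data memory, addressed by label
registers = {"r1": 0, "r2": 0}        # toy register file

def execute(program):
    """Run a list of instructions drawn from the reduced set: LOAD, ADD, STORE."""
    for op, *args in program:
        if op == "LOAD":              # LOAD reg, addr  : reg <- memory[addr]
            reg, addr = args
            registers[reg] = memory[addr]
        elif op == "ADD":             # ADD dst, src    : dst <- dst + src
            dst, src = args
            registers[dst] += registers[src]
        elif op == "STORE":           # STORE reg, addr : memory[addr] <- reg
            reg, addr = args
            memory[addr] = registers[reg]
        else:
            raise ValueError(f"unknown instruction {op}")

def add_mem(dst, src1, src2):
    """A CISC-style 'add memory to memory' built from the reduced set."""
    return [
        ("LOAD", "r1", src1),
        ("LOAD", "r2", src2),
        ("ADD", "r1", "r2"),
        ("STORE", "r1", dst),
    ]

execute(add_mem("Z", "X", "Y"))
print(memory["Z"])                    # prints 42

Only the three simple operations would need dedicated hardware; the complex memory-to-memory add exists only as a short sequence of them.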
On the West Coast, David Patterson at Berkeley and John Hennessy at Stanford were pursuing similar
ideas. It was Patterson who coined the name RISC, for reduced instruction set computing - in contrast to the usual complex instruction set computing (CISC) architecture of a standard microprocessor. The first RISC processor appeared in 1980, in an experimental research computer called the IBM 801. Cocke's
ideas made their way into the IBM POWER architecture - an acronym for Performance Optimization With
Enhanced RISC. This led to the introduction of IBM's RS/6000 (RS for RISC System) workstations in 1990.
Cocke's colleague, Fran Allen, worked with him on the interaction of computer architectures and compilers, and they were responsible for developing many innovative compiler optimization techniques (see B.8.20).
In recent years there has been a coming together of RISC and CISC. The new microprocessors of Intel's x86 series externally support a CISC instruction set of almost nine hundred instructions, but internally only a RISC-like subset of instructions is actually implemented in silicon.
For smart phones and tablets, power consumption and battery life are very important. The United
Kingdom-based company ARM Holdings - ARM standing for Advanced RISC Machines - had its origins in the
Acorn computer company. In the United Kingdom, Acorn had great success with a personal computer called the BBC Microcomputer (Fig. 8.26). When looking for a microprocessor for their next-generation machine, they took the unusual step of deciding to design their own. Hermann Hauser, the CEO of Acorn,
encouraged two of his engineers, Steve Furber and Sophie Wilson, to look at the Berkeley RISC papers and
then sent them on a fact-finding visit to the United States. They visited Bill Mensch, CEO of the Western Design Center in Phoenix, Arizona, and were amazed at the tiny scale of his globally successful operation. As Wilson tells it: “A couple of senior engineers, and a bunch of college kids . . . were designing this thing. . . .”
B.8.19. John Cocke (1925-2002) received a BS degree in mechanical engineering from Duke University in
1946 and later went back to Duke to complete a PhD in mathematics in 1956. He then joined IBM Research, where he remained for the rest of his working life. At a symposium in honor of John Cocke in
1990, Fred Brooks described him as a “fire starter” because of his constant stream of ideas: “The
metaphor that comes to mind is of a man running through a forest with flint and steel, striking sparks
everywhere.” F5 After working on IBM's Stretch project, an ambitious effort to build the fastest scientific
computer, and the Advanced Computer Systems research project, in 1975 Cocke led the research team
building the experimental IBM 801 computer, which pioneered the ideas of RISC architectures and
optimizing compiler technology. In the 1980s, these ideas led to the IBM POWER architecture and the
RS/6000 RISC workstations. In 1987, Cocke received the Turing Award for the development of RISC and
for his work on optimizing compilers with Fran Allen. In his 1990 talk, Fred Brooks characterized
Shannon, von Neumann, and Aiken as the three “greats” of the first generation of computer scientists;
and Knuth, Sutherland, and Cocke as the three “greats” of the next generation.
 