As the market for computers exploded in the 1970s and computing capabilities grew rapidly, the demand for low-cost computers favored interpreter-based designs. The ability to tailor the hardware and the interpreter together for a particular instruction set emerged as a highly cost-effective way to design processors. As the underlying semiconductor technology advanced rapidly, the cost advantages outweighed the opportunities for higher performance, and interpreter-based architectures became the conventional way to design computers.
Nearly all new computers designed in the 1970s, from minicomputers to main-
frames, were based on interpretation.
By the late 1970s, the use of simple processors running interpreters had become very widespread except among the most expensive, highest-performance models, such as the Cray-1 and the Control Data Cyber series. The use of an interpreter eliminated the inherent cost limitations of complex instructions, so designers began
to explore much more complex instructions, particularly the ways to specify the
operands to be used.
This trend reached its zenith with Digital Equipment Corporation's VAX com-
puter, which had several hundred instructions and more than 200 different ways of
specifying the operands to be used in each instruction. Unfortunately, the VAX ar-
chitecture was conceived from the beginning to be implemented with an inter-
preter, with little thought given to the implementation of a high-performance
model. This mindset resulted in the inclusion of a very large number of instructions that were of marginal value and difficult to execute directly. This lack of attention to performance
proved to be fatal to the VAX, and ultimately to DEC as well (Compaq bought
DEC in 1998 and Hewlett-Packard bought Compaq in 2001).
Though the earliest 8-bit microprocessors were very simple machines with
very simple instruction sets, by the late 1970s, even microprocessors had switched to
interpreter-based designs. During this period, one of the biggest challenges facing
microprocessor designers was dealing with the growing complexity made possible
by integrated circuits. A major advantage of the interpreter-based approach was
the ability to design a simple processor, with the complexity largely confined to the
memory holding the interpreter. Thus a complex hardware design could be turned
into a complex software design.
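To make the idea concrete, the following C sketch (not from the original text; the opcodes, register file, and encoding are invented purely for illustration) shows the shape of such an interpreter: the "engine" is nothing more than a trivial fetch-decode-execute loop, and all of the apparent instruction-set complexity lives in the software that loop dispatches to. Supporting a richer instruction would add only another case to the switch, not more hardware.

/*
 * Minimal illustrative sketch: a simple execution engine interprets a
 * richer instruction set held entirely in software. The instruction set,
 * register count, and encoding here are hypothetical.
 */
#include <stdio.h>
#include <stdint.h>

enum { OP_HALT, OP_LOADI, OP_ADD, OP_PRINT };   /* hypothetical opcodes */

int main(void) {
    /* Hypothetical program: r0 = 2; r1 = 3; r0 = r0 + r1; print r0; halt */
    uint8_t program[] = {
        OP_LOADI, 0, 2,
        OP_LOADI, 1, 3,
        OP_ADD,   0, 1,
        OP_PRINT, 0,
        OP_HALT
    };

    int32_t reg[4] = {0};     /* a tiny register file */
    size_t pc = 0;            /* program counter */

    /* The simple "engine": fetch an opcode, dispatch, repeat. A more     */
    /* elaborate instruction only grows this switch (software), not the   */
    /* engine itself -- the point made in the text above.                 */
    for (;;) {
        uint8_t op = program[pc++];
        switch (op) {
        case OP_LOADI: {                  /* load immediate into register */
            uint8_t r   = program[pc++];
            uint8_t imm = program[pc++];
            reg[r] = imm;
            break;
        }
        case OP_ADD: {                    /* reg[a] += reg[b]             */
            uint8_t a = program[pc++];
            uint8_t b = program[pc++];
            reg[a] += reg[b];
            break;
        }
        case OP_PRINT:                    /* print a register             */
            printf("r%d = %d\n", program[pc], (int)reg[program[pc]]);
            pc++;
            break;
        case OP_HALT:
        default:
            return 0;
        }
    }
}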
The success of the Motorola 68000, which had a large interpreted instruction
set, and the concurrent failure of the Zilog Z8000 (which had an equally large in-
struction set, but without an interpreter) demonstrated the advantages of an inter-
preter for bringing a new microprocessor to market quickly. This success was all
the more surprising given Zilog's head start (the Z8000's predecessor, the Z80, was
far more popular than the 68000's predecessor, the 6800). Of course, other factors
were instrumental here, too, not the least of which were Motorola's long history as a chip manufacturer and the fact that Zilog's owner, Exxon, had a long history as an oil company, not a chip manufacturer.
Another factor working in favor of interpretation during that era was the existence of fast read-only memories, called control stores, to hold the interpreters.
 