work of William Smith, the founder of stratigraphy - the branch of geology
that studies rock layers and layering (Fig. 1.1). While the layering approach
used in computer science was not inspired by geological layers, Feynman's
analogy serves as a useful memory hook for explaining hierarchical layers of
computer architecture by reminding us that we can examine and understand
things at each level (Fig. 1.2). This is the key insight that makes computers
comprehensible.
Universality is linked to the notion of a universal computer that was intro-
duced by Alan Turing and others. Turing proposed a very simple model for a
computer called a Universal Turing Machine. It reads instructions encoded on
a paper tape divided into sections and follows a very simple set of rules as
the instruction in each section is read. Such a machine
would be horribly inefficient and slow at doing complex calculations; more-
over, for any specific problem, one could design a much more efficient, special-
purpose machine. Universality is the idea that, although these other computers
may be faster, the Universal Turing Machine can do any calculation that they
can do. This is known as the Church-Turing thesis and is one of the corner-
stones of computer science. This truly remarkable conjecture implies that your
laptop, although much, much slower than the fastest supercomputer, is in
principle just as powerful - in the sense that the laptop can do any calculation
that can be done by the supercomputer!
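The mechanism Turing described can be made concrete with a few lines of code. The sketch below is a minimal, illustrative Turing machine simulator - not Turing's original formulation - in which the rule table, state names, and the sample "binary increment" program are all invented here for demonstration:

```python
# A minimal sketch of a Turing machine (illustrative; the rule format and
# example program are our own, not Turing's original notation).
# The machine reads one tape cell at a time and consults a fixed rule table:
#   (state, symbol) -> (symbol to write, move L/R, next state)

def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Run the machine until it enters the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")              # "_" marks a blank cell
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example rule table: add 1 to a binary number written on the tape.
rules = {
    ("start", "0"): ("0", "R", "start"),  # scan right past the digits
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),  # step back onto the last digit
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry = 0, carry propagates
    ("carry", "0"): ("1", "L", "halt"),   # 0 + carry = 1, done
    ("carry", "_"): ("1", "L", "halt"),   # overflow: prepend a new digit
}

print(run_turing_machine(rules, "1011"))  # binary 11 + 1 -> "1100" (12)
```

Slow and laborious as this is, it illustrates the point of universality: by changing only the rule table and the tape contents, the same trivial machine can in principle carry out any computation.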
So how did we get to this powerful laptop? Although the idea of powerful
computational machines dates to the early nineteenth century, the direct line
to today's electronic computers can be traced to events during World War II
(1939-1945).
Fig. 1.1 The famous geological map of
Great Britain devised by William “Strata”
Smith (1769-1839). Smith was a canal
and mining engineer who had observed
the systematic layering of rocks in the
mines. In 1815, he published the “map
that changed the world” - the first large-
scale geological map of Britain. Smith
was the first to formulate the superposition
principle, by which rocks are successively
laid down on older layers. It is a similar
layer-by-layer approach in computer
science that allows us to design complex
systems with hundreds of millions of
components.
A chance encounter
There are many detailed histories of the origins of computing, and it would
take us too far from our goal to discuss this history in detail. Instead, we will
concentrate only on the main strands, beginning with a chance meeting at a
train station.
In 1943, during World War II, the U.S. Army had a problem. Their Ballistic
Research Laboratory (BRL) in Aberdeen, Maryland, was falling badly behind
in its calculations of firing tables for all the new guns that were being pro-
duced. Each new type of gun needed a set of tables for the gunner that showed
the correct angle of fire for a shell to hit the desired target. These trajec-
tory calculations were then being carried out by a machine designed by MIT
professor Vannevar Bush: the differential analyzer (Fig. 1.3). It was
an analog device, like the slide rules that engineers once used before they
were made obsolete by digital calculators, but built on a massive scale. The
machine had many rotating disks and cylinders driven by electric motors and
linked together with metal rods, and had to be manually set up to solve any
specific differential equation problem. This setup process could take as long
as two days. The machine was used to calculate the basic trajectory of the
shell before the calculation was handed over to an army of human “comput-
ers” who manually calculated the effects on this trajectory of other variables,
such as the wind speed and direction. By the summer of 1944, calculating
Fig. 1.2 This sponge cake offers a further
analogy for abstraction layers. It is most
certainly more appealing to our senses
than the rock layers of geological
periods.