more powerful than the mainframe computers used in 1969 to place
man on the moon.
In 1997 Moore predicted that it would be another twenty years before
transistor miniaturisation reached its physical limits. Later he suggested
that his prediction would hold until at least 2025, by which time he expected
each computer chip to hold one billion transistors. But by the time that
the speed of computers can no longer be increased significantly by in-
creasing the number of transistors on a silicon chip, there will already be
other technologies enabling even faster computing. Some of these new
technologies are discussed by Michio Kaku of the City College of New
York in his book Visions: How Science Will Revolutionize the 21st Century.
They include the optical computer, the DNA computer, the molecular
computer and the quantum computer. Kaku believes that these are likely
to become realistic possibilities near the end of the twenty-first century
and that, of the four, the optical computer is the leading candidate be-
cause so much knowledge of optical technology is being uncovered in the
telecommunications explosion. Molecular computers and DNA-based
technologies that mimic our own genes are also interesting areas of
scientific exploration, even though, in Kaku's opinion, their feasibility is
more distant than that of optical computing.
The Optical Computer
The computers of today are silicon-based. They employ logic gates 4 to
convert electrical voltages into the various logic functions employed in
Boolean algebra. 5
The speed of silicon-based computers is limited by the speed with
which they can transfer data and the speed with which that data can be
4 Physically, a logic gate is a transistor circuit that either allows voltages to pass through the gate or
prevents voltages from passing through the gate, depending on the simple rules of logic and Boolean
algebra (see Chapter 1). From the perspective of Boolean algebra, a logic gate is an electronic circuit
whose output state (1 or 0) depends on the specific combination of the states of the various input
signals into the gate. For example, just as Boolean algebra prescribes that “A and B” is true if and
only if both A is true and B is true, so an AND logic gate has an output of 1 (equivalent to “true”) if
and only if all of its inputs are 1; otherwise the output of that AND gate is 0 (equivalent to “false”).
And just as Boolean algebra prescribes that “A or B” is true if A is true or B is true (or both are
true), so an OR logic gate has an output of 1 (true) if any of its inputs is 1; otherwise its output
state is 0 (false). The speed of computation in a computer is closely related to the speed with which
the transistors in logic gates can switch back and forth, which explains why the size of transistors
(and hence their speed) is such a crucial factor in the speed of computer processing.
5 See the section “Early Logic Machines” in Chapter 1.
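The gate behaviour described in note 4 can be made concrete with a short sketch. The following Python snippet is illustrative only (the function names and_gate and or_gate are not from the text); it simply reproduces the two truth tables: an AND gate outputs 1 only when every input is 1, while an OR gate outputs 1 when any input is 1.

# Minimal truth-table sketch of the AND and OR gates described in note 4.
# Each gate maps a combination of binary inputs (1 = "true", 0 = "false")
# to a single binary output, exactly as Boolean algebra prescribes.
from itertools import product

def and_gate(*inputs):
    """Output 1 only if every input is 1, otherwise 0."""
    return 1 if all(bit == 1 for bit in inputs) else 0

def or_gate(*inputs):
    """Output 1 if any input is 1, otherwise 0."""
    return 1 if any(bit == 1 for bit in inputs) else 0

if __name__ == "__main__":
    print(" A B | A AND B | A OR B")
    for a, b in product((0, 1), repeat=2):
        print(f" {a} {b} |    {and_gate(a, b)}    |   {or_gate(a, b)}")

Running the script prints the four input combinations and their outputs, matching the rules of Boolean algebra given in the footnote. In a physical chip, each of these functions is realised by a small transistor circuit whose switching speed sets the pace of computation.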