possible by the powerful computational tools available to them. Bioinformatics
has emerged as the science of the 21st century, requiring the contributions of
truly interdisciplinary scientists who are equally at home at the lab bench or
writing software at the computer.
However, the seeds of the relationship between biology and computer science
were sown long ago, when the latter discipline did not even exist. When, in
the 17th century, the French mathematician and philosopher René Descartes
declared to Queen Christina of Sweden that animals could be considered a class
of machines, she challenged him to demonstrate how a clock could reproduce.
Three centuries later, with the publication of The General and Logical Theory
of Automata [19], John von Neumann showed how a machine could indeed
construct a copy of itself. Von Neumann believed that the behavior of natural
organisms, although orders of magnitude more complex, was similar to that of
the most intricate machines of the day. He believed that life was based on logic.
In 1970, the Nobel laureate Jacques Monod identified specific natural processes
that could be viewed as behaving according to logical principles: “The
logic of biological regulatory systems abides not by Hegelian laws but, like
the workings of computers, by the propositional algebra of George Boole” [16,
p. 76; see also 15].
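Monod's observation can be made concrete with a toy illustration. The short Python sketch below encodes an entirely hypothetical regulatory rule (a gene expressed only when an activator is bound and a repressor is not) as a Boolean expression; it is offered as an analogy in the spirit of the quotation, not as a model of any real regulatory system.

    # Toy illustration of Monod's point: a regulatory rule as Boolean algebra.
    # The "gene" and its regulators are hypothetical; real regulation is far more nuanced.

    def gene_expressed(activator_bound: bool, repressor_bound: bool) -> bool:
        """Expression occurs iff the activator is bound AND the repressor is not."""
        return activator_bound and not repressor_bound

    # Enumerate the truth table implied by the regulatory rule.
    for activator in (False, True):
        for repressor in (False, True):
            print(activator, repressor, gene_expressed(activator, repressor))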
The concept of molecular complexes forming computational components
was first proposed by Richard Feynman in his famous talk “There's Plenty
of Room at the Bottom” [11]. The idea was further developed by Bennett [6]
and Conrad and Liberman [9], and since then there has been an explosion of
interest in performing computations at a molecular level. In 1994, Adleman
showed how a massively parallel random search could be implemented using
standard operations on strands of DNA [1; see also 2]. Several authors have
proposed simulations of Boolean circuits in DNA [3, 17], and recently the
regulation of gene expression in bacteria has been proposed as a potential in
vivo computational framework. We now discuss this last development in more
detail.
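Before doing so, it is worth giving a flavour of the generate-and-filter strategy behind Adleman's experiment. The Python sketch below mimics it in software: candidate paths through a small directed graph are produced at random (standing in for the enormous number of DNA strands formed in parallel by ligation) and then filtered until only Hamiltonian paths remain. The graph, the sample size, and all names are illustrative assumptions rather than a description of Adleman's actual protocol.

    import random

    # A tiny directed graph; an assumed example, not Adleman's seven-city instance.
    edges = {(0, 1), (1, 2), (2, 3), (1, 3), (0, 2)}
    n = 4  # number of vertices

    def random_path(length: int) -> list:
        """Generate a random vertex sequence, analogous to the random joining of strands."""
        return [random.randrange(n) for _ in range(length)]

    def is_hamiltonian(path: list) -> bool:
        """Keep only paths that follow legal edges and visit every vertex exactly once,
        mirroring the successive extraction (filtering) steps performed on DNA."""
        if sorted(path) != list(range(n)):
            return False
        return all((a, b) in edges for a, b in zip(path, path[1:]))

    # 'Massively parallel' generation, here simply many random samples.
    candidates = (random_path(n) for _ in range(100_000))
    solutions = {tuple(p) for p in candidates if is_hamiltonian(p)}
    print(solutions)  # e.g. {(0, 1, 2, 3)}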
BACKGROUND
Although proposed by Feynman [11] as long ago as 1959, the realization of
computation at the molecular level had to wait for the development of the
necessary methods and materials. However, a rich body of theoretical
work existed prior to Adleman's experiment. In 1982, Bennett [6] proposed
the concept of a “Brownian computer” based around the principle of reactant
molecules touching, reacting, and effecting state transitions due to their random
Brownian motion. Bennett developed this idea by suggesting that a Brownian
Turing machine could be built from a macromolecule such as RNA. “Hypothetical
enzymes,” one for each transition rule, catalyze reactions between the