4.3 What Is Information?
The concept of information is central not only to computer science (Wolfram 2002;
Lloyd 2006), physics (Wheeler 1990), and biology (e.g., this chapter) but also to
philosophy and theology (Davies and Gregersen 2010). Molecular machines require
both free energy and genetic information to carry out goal-directed molecular work
processes. The definition of free energy is given in Sect. 2.1.2. In this section, the
term information is defined primarily within the context of molecular and cell
biology. The dictionary definitions of information include the following (the last
two items are my additions):
1. Knowledge obtained from investigation, study, or instruction.
2. Intelligence, news, facts, data.
3. The attribute inherent in and communicated by one of two or more alternative
sequences or arrangements of something (such as the nucleotides in DNA and
RNA or binary digits in a computer program) that produce specific effects.
4. A quantitative measure of the uncertainty in the outcome of an experiment to be
performed.
5. A formal accusation of a crime made by a prosecuting officer as distinguished
from an indictment presented by a grand jury.
6. Anything or any process that is associated with a reduction in uncertainty about
something.
7. Information is always associated with making a choice or a selection between
at least two alternatives or possibilities.
It is generally accepted that there are three aspects to information (Volkenstein
2009, Chap. 7):
1. Amount (How much information can your USB drive store?)
2. Meaning (What is the meaning of this sequence of nucleotides? What does it
code for?)
3. Value (What practical effects does this nucleotide sequence have on a cell?)
All of these aspects of information play important roles in biology, but only the
quantitative aspect of information was emphasized by Shannon (1916-2001)
(Shannon and Weaver 1949), who proposed that the information carried by a
message can be quantified in terms of the probability of the message being
selected from the set of all possible messages, as shown in Eq. 4.2:
H = -K \sum_{i=1}^{n} p_i \log_2 p_i        (4.2)
where H is the Shannon entropy (also called the information-theoretic entropy,
or intropy) of a message source, K is a positive constant that fixes the unit of
measure (K = 1 with the base-2 logarithm gives H in bits), n is the total number
of possible messages, and p_i is the probability of the ith message being selected
for transmission to the receiver.
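To make Eq. 4.2 concrete, here is a minimal Python sketch of the computation
(the function name shannon_entropy and the default K = 1, which yields H in
bits, are my illustrative choices, not part of the original sources):

import math

def shannon_entropy(probabilities, K=1.0):
    """Shannon entropy H = -K * sum(p_i * log2(p_i)).

    probabilities is the distribution over the n possible messages;
    terms with p_i = 0 contribute nothing, since p*log2(p) -> 0 as p -> 0.
    """
    return -K * sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin (two equally likely messages) carries 1 bit per message:
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A biased source is less uncertain, so each message carries less information:
print(shannon_entropy([0.9, 0.1]))    # ~0.469

# Four equiprobable nucleotides (A, T, G, C) carry 2 bits per position:
print(shannon_entropy([0.25] * 4))    # 2.0

The last case connects the quantitative measure back to the biological context:
a random DNA sequence carries at most 2 bits of Shannon entropy per nucleotide.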