Table 4.3 A comparison between entropy and information

1. Alternative names
   Entropy (S): Thermodynamic entropy; Boltzmann entropy (Boltzmann-Clausius entropy)
   Information (I): Informational entropy ("intropy"); information-theoretic entropy; Shannon entropy (H)

2. Recommended names
   Entropy (S): Entropy (S); Boltzmann-Clausius entropy (S)
   Information (I): Information (I); Shannon entropy (H)

3. Mathematics^a
   Entropy (S): S = k ln W, where W (≥ 1) is the number of microscopic arrangements (microstates) of the system
   Information (I): I = K log P, where K is a negative number and P (≤ 1) is the probability associated with a message

4. Principles obeyed
   Entropy (S): Second Law of thermodynamics: "The entropy of isolated systems increases with time." "All irreversible processes produce entropy."
   Information (I): "Not all irreversible processes produce information." "Some irreversible processes can decrease information."

5. Temperature sensitivity
   Entropy (S): Yes^b
   Information (I): Not always^c

6. Cross-relation^d
   Entropy (S): P in S = k ln P is a form of I^e
   Information (I): S increase or S production is required for I transmission

7. Relation to energy
   Entropy (S): Direct (i.e., dS = dQ/T)
   Information (I): Indirect (i.e., C = B log2(1 + P/N))^f

8. Subsethood^g
   S ⊂ I

9. Fields of study^h
   Entropy (S): Thermodynamics
   Information (I): Informatics

10. Common principle^i
^a The term W cannot be less than 1 due to the constraint imposed by the Third Law of thermodynamics, which states that the entropy of a perfect crystal is zero at absolute zero temperature; hence S cannot be negative.
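To make the constraint in footnote a concrete, here is a minimal sketch (mine, not from the original text) that evaluates S = k ln W for a few values of W, using the SI value of the Boltzmann constant; it shows that S = 0 when W = 1 and S > 0 when W > 1, so S is never negative for W ≥ 1. The function name is my own.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K (exact value in the SI)

def boltzmann_entropy(W: float) -> float:
    """Boltzmann entropy S = k ln W for W >= 1 microstates."""
    if W < 1:
        raise ValueError("W cannot be less than 1 (Third Law constraint)")
    return k * math.log(W)

# S = 0 for a perfect crystal at absolute zero (W = 1),
# and S > 0 whenever more than one microstate is accessible.
for W in (1, 2, 10, 6.022e23):
    print(f"W = {W:.3g}  ->  S = {boltzmann_entropy(W):.3e} J/K")
```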
^b This statement is true because all material objects have positive heat capacities.
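As a small illustration of footnote b (not in the original), the sketch below integrates dS = dQ/T at constant heat capacity, giving dS = m c ln(T2/T1); because the heat capacity is positive, heating (T2 > T1) always increases S. The mass and specific-heat values are illustrative assumptions.

```python
import math

def entropy_change(m_kg: float, c_J_per_kgK: float, T1_K: float, T2_K: float) -> float:
    """Entropy change from dS = dQ/T at constant heat capacity: dS = m c ln(T2/T1)."""
    return m_kg * c_J_per_kgK * math.log(T2_K / T1_K)

# Illustrative (assumed) numbers: 1 kg of water, c ~ 4186 J/(kg K), heated 300 K -> 350 K.
dS = entropy_change(1.0, 4186.0, 300.0, 350.0)
print(f"dS = {dS:.1f} J/K  (positive because the heat capacity is positive)")
```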
^c When one heats up a book such as the Bible, the thermodynamic entropy associated with the molecular motions of the paper constituting the pages of the Bible will increase, but the informational entropy associated with the arrangement of letters in the Bible will not be affected until the temperature rises high enough to burn the Bible. This thought experiment may be conveniently referred to as the Bible test.
^d The relation between S and I.
^e The term P in the equation for S refers to the statistical weights P of all possible microstates and as such represents a form of information different from the information defined by Shannon as the logarithmic function of the probability P of an event. Hence, we can recognize two kinds of information: (1) the logarithmic or indirect information as defined by Shannon, and (2) the nonlogarithmic or direct information as perceived by the human brain directly. We may also refer to the former as second-order information and the latter as first-order information.
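The "logarithmic information" of footnote e can be illustrated with a short sketch (my own, not from the text): taking I = K log P with K = -1 and base-2 logarithms gives the familiar Shannon measure in bits, where a less probable message carries more information.

```python
import math

def shannon_information(P: float) -> float:
    """I = K log P with K = -1 and base-2 logarithm, i.e. I = -log2(P) bits, 0 < P <= 1."""
    if not 0.0 < P <= 1.0:
        raise ValueError("P must be a probability in (0, 1]")
    return -math.log2(P)

# A rarer message carries more (logarithmic) information:
for P in (1.0, 0.5, 0.25, 0.01):
    print(f"P = {P:<5} ->  I = {shannon_information(P):.2f} bits")
```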
^f This is the channel capacity (denoted as C) equation of Shannon (Shannon and Weaver 1949), Eq. 4.29, which states that no information can be transmitted when no energy is dissipated (or no power P is expended, or no entropy is produced), i.e., when P = 0, C = 0. B is the bandwidth of the communication channel. Please note that the P appearing in the channel capacity equation is not the same as the P appearing in the equation for Shannon entropy or Shannon information.
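A minimal numerical sketch of the capacity equation in footnote f (the bandwidth and power values are assumed, not from the original): C = B log2(1 + P/N) vanishes as the signal power P goes to zero, which is exactly the point the footnote makes.

```python
import math

def channel_capacity(B_Hz: float, P_W: float, N_W: float) -> float:
    """Shannon channel capacity C = B log2(1 + P/N), in bits per second."""
    return B_Hz * math.log2(1.0 + P_W / N_W)

# Assumed values: a 3 kHz channel with 1 W of noise power.
B, N = 3000.0, 1.0
for P in (0.0, 1.0, 10.0, 100.0):
    print(f"P = {P:>6.1f} W ->  C = {channel_capacity(B, P, N):8.0f} bit/s")
# With P = 0 the capacity is C = B log2(1) = 0: no power expended, no information transmitted.
```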