dS = d_iS + d_eS    (14.28)
where dS is the change in the entropy content of a nonisolated system, d_iS is the
entropy increase due to irreversible processes occurring inside the system, and d_eS
is the entropy change (either positive or negative) of the system due to the exchange
of matter and energy with its environment. Despite the apparent formal similarity
between Eqs. 14.26 and 14.27, there is an important difference. PDB is a difference
equation whereas the entropy balance equation of Prigogine (EBEP) is a summation
equation. Another significant difference between PDB and EBEP is that, whereas
PDB includes both entropy-producing (i.e., S+) and entropy-consuming (i.e., S−)
processes inside the system, EBEP recognizes only the entropy-producing pro-
cesses inside the system, excluding any entropy-consuming ones (e.g., biosynthesis
in organisms), and hence must be viewed as incomplete.
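The bookkeeping expressed by Eq. 14.28 can be illustrated with a minimal numerical sketch (the function name and the sample values in J/K are illustrative assumptions, not from the text):

```python
# Sketch of Prigogine's entropy balance, dS = d_iS + d_eS,
# with made-up illustrative values in J/K.

def entropy_change(d_i_s: float, d_e_s: float) -> float:
    """Total entropy change dS of a nonisolated system.

    d_i_s: entropy produced by irreversible processes inside the
           system (always >= 0 by the Second Law).
    d_e_s: entropy exchanged with the environment (either sign).
    """
    if d_i_s < 0:
        raise ValueError("internal entropy production cannot be negative")
    return d_i_s + d_e_s

# An open system can lower its total entropy (dS < 0) as long as the
# entropy it exports to the environment outweighs internal production.
print(entropy_change(0.5, -2.0))  # dS = -1.5 J/K: net decrease
print(entropy_change(0.5, 0.0))   # dS = 0.5 J/K: no exchange term
```

The sketch makes the sign structure explicit: d_iS is constrained to be nonnegative, while d_eS carries the exchange with the environment and can take either sign.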
Perhaps one of the most novel features of Table 14.9 is the appearance of
information, I, and its cognate concepts such as communication, selection, organi-
zation, control, development, and evolution in a row that is separate from the row for
entropy, graphically illustrating the independence of information from entropy (or
vice versa), in contrast to the views of Jaynes (1957a, b), Brillouin (1953, 1956), and
others who view information and entropy as two different names for a fundamentally
identical entity. Their argument (see Volkenstein 2009, p. 63) is largely based on the
formal similarity between the mathematical expression for the thermodynamic
entropy, Eq. 14.29, formulated by Boltzmann (1844-1906), and the equation for
information proposed by Shannon (see Eq. 4.2), which can be reduced to Eq. 14.30.
S = k ln P    (14.29)
where k is the Boltzmann constant and P is the number of microstates compatible
with the macrostate of the system under consideration (also called the statistical
weight of the macrostate).
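A quick numerical sketch of Eq. 14.29 (the function name and the choice of P = 2 are illustrative assumptions; the constant is the CODATA value of k):

```python
import math

# Eq. 14.29: S = k ln P, the Boltzmann expression for thermodynamic
# entropy. P = 2 models a single two-microstate unit (illustrative).
k_B = 1.380649e-23  # Boltzmann constant, J/K (CODATA)

def boltzmann_entropy(P: int) -> float:
    """Thermodynamic entropy of a macrostate realizable in P microstates."""
    return k_B * math.log(P)

print(boltzmann_entropy(2))  # ~9.57e-24 J/K
```

Note that a macrostate with a single microstate (P = 1) gives S = 0, consistent with ln 1 = 0.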
I = K log₂ P    (14.30)
where I is the information associated with an event that can occur in P different
ways and K is a proportionality constant (Volkenstein 2009, p. 142). The formal
similarity between the equations for S and I is indeed striking. The entity defined by
Eq. 14.29 is referred to as the thermodynamic entropy or the Boltzmann-Clausius
entropy (to be designated as S_T), and the entity defined by Eq. 14.30 is referred to
as the informational entropy, information-theoretic entropy, or the Shannon entropy
(Volkenstein 2009, p. 145) (to be designated as S_I). Whether or not S_T and S_I are
similar not only on the formal level (as seen above) but also on the substantial level
may critically depend on the nature of P, the number of different ways that an event
can occur or the number of different ways an object can be described. If it is accepted
that there are indeed three fundamental entities, E, S, and I, as indicated in the second
row of Table 14.9, and if E and S are associated with the First and Second Laws