The H-theorem was proved by Boltzmann in 1872. It introduces the function H, which appears to predict an irreversible increase in entropy, despite the microscopically reversible dynamics of thermodynamical systems. This conclusion was considered paradoxical by many authoritative thinkers of the time, but many criticisms missed some deep aspects of the question. The main point is that in a population dynamics, resulting from a huge number of individual dynamics, properties emerge which are new with respect to the simple addition of the individual effects. In other words, in molecule populations a temporal asymmetry emerges which breaks the time symmetry. Today, many proofs of the H-theorem are available. In this section we define and interpret some numerical experiments which confirm Boltzmann's conclusion and suggest the statistical and informational character of the thermodynamical arrow of time.
Given a partition of an interval of the real line into subintervals of size τ (the discretization approximation), we call a (discrete) distribution any finite population of (real) values, which are considered indiscernible when they belong to the same subinterval of the partition. A distribution can be considered a (discrete) random variable where the internal multiplicities (how many values fall in each subinterval) provide the probability distribution of the random variable. In fact, a distribution over interval values v_1, v_2, ..., v_m with multiplicities k_1, k_2, ..., k_m respectively, where k = k_1 + k_2 + ... + k_m, determines the probability distribution

p_1 = k_1/k, p_2 = k_2/k, ..., p_m = k_m/k.

Given a value distribution, the (discrete) Boltzmann function H for this population has the following definition, where m is the number of subintervals which partition the distribution interval, and n_i is the number of values belonging to the i-th subinterval:

H = ∑_{i=1}^{m} n_i lg n_i.
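These definitions are easy to render in code. The following Python sketch is illustrative, not taken from the text: the interval bounds, the subinterval size τ, and the sample population are assumptions chosen for the example. It computes the multiplicities n_i, the induced probabilities p_i = n_i/k, and the H function:

```python
import math

def discrete_distribution(values, tau, lo, hi):
    """Bin the values into subintervals of size tau over [lo, hi)
    and return the multiplicities n_i of each subinterval."""
    m = int((hi - lo) / tau)
    counts = [0] * m
    for v in values:
        i = min(int((v - lo) / tau), m - 1)  # clamp the right endpoint
        counts[i] += 1
    return counts

def probabilities(counts):
    """p_i = k_i / k: the probability distribution of the random variable."""
    k = sum(counts)
    return [n / k for n in counts]

def boltzmann_H(counts):
    """H = sum_i n_i lg n_i (empty subintervals contribute nothing)."""
    return sum(n * math.log2(n) for n in counts if n > 0)

# illustrative population of 7 real values on [0, 1), tau = 0.25
values = [0.1, 0.3, 0.35, 0.7, 0.72, 0.74, 0.9]
counts = discrete_distribution(values, tau=0.25, lo=0.0, hi=1.0)
print(counts)                 # multiplicities: [1, 2, 3, 1]
print(probabilities(counts))  # they sum to 1
print(boltzmann_H(counts))
```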
The H function is related to Shannon's entropy S (already introduced by Gibbs for thermodynamical systems): apart from additive and multiplicative constants, they are functions of the same kind, but with opposite signs. Therefore, if H decreases, then S increases. Both H and S extend easily to continuous variables and to continuous probability distributions by replacing sums with integrals. Very often the symbol H denotes Shannon entropy; to avoid confusion, in this section we reserve the symbol H for Boltzmann's function and S for Shannon's entropy.
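The sign relation can be made explicit. Writing p_i = n_i/k with k = ∑_i n_i, one gets H = ∑_i n_i lg n_i = ∑_i k p_i (lg k + lg p_i) = k lg k − k·S, so the two functions differ only by the multiplicative constant k, the additive constant k lg k, and the sign. A small numerical check (the multiplicities are illustrative):

```python
import math

counts = [5, 12, 7, 1]  # illustrative multiplicities n_i
k = sum(counts)
H = sum(n * math.log2(n) for n in counts)             # Boltzmann's H
S = -sum((n / k) * math.log2(n / k) for n in counts)  # Shannon's S
# identity: H = k lg k - k S, hence a decrease of H is an increase of S
print(H, k * math.log2(k) - k * S)  # the two printed values coincide
```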
Let us consider an ideal gas where molecules can be assimilated to balls and their collisions are elastic collisions in which the momentum and energy conservation laws hold. We can describe the evolution of such a gas in a simplified, abstract two-dimensional setting. In fact, let us start from a distribution of values representing molecule velocities, and then apply the following evolution rules, which abstractly correspond to molecule collisions.
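The physical step such rules abstract, an elastic collision of two equal-mass two-dimensional balls, can be sketched directly: the velocity components of the two balls along the collision line (the line through their centers) are exchanged, while the perpendicular components are unchanged; this preserves both total momentum and kinetic energy. The following Python sketch assumes equal masses and is an illustration, not the table's rule set:

```python
def collide(v1, v2, n):
    """Elastic collision of two equal-mass balls in 2D.
    v1, v2: velocity vectors (x, y); n: unit vector along the
    collision line through the two centers. The components of
    v1 and v2 along n are exchanged; the components perpendicular
    to n are unchanged."""
    a1 = v1[0] * n[0] + v1[1] * n[1]  # component of v1 along n
    a2 = v2[0] * n[0] + v2[1] * n[1]  # component of v2 along n
    d = a2 - a1
    w1 = (v1[0] + d * n[0], v1[1] + d * n[1])
    w2 = (v2[0] - d * n[0], v2[1] - d * n[1])
    return w1, w2

# head-on collision along the x axis: the velocities are exchanged
w1, w2 = collide((1.0, 0.0), (-1.0, 0.0), (1.0, 0.0))
print(w1, w2)  # (-1.0, 0.0) (1.0, 0.0)
```

Momentum is conserved because w1 + w2 = v1 + v2 by construction, and kinetic energy is conserved because exchanging the two components along n leaves the sum of their squares unchanged.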
It is easy to realize that the game described in Table 4.6 represents, in a popula-
tion of molecules, the collision of two molecules (two-dimensional balls), along
a collision line (passing through the centers of the two balls), which, after the