dynamics. A collection of initial states for the system is a set of points in phase space, described by a phase-space distribution function that occupies a certain volume. Imagine that the phase space for an isolated network can be partitioned
into a large number of cells and that each cell is statistically equivalent to each of the
other cells. There is therefore an equal probability of a particle occupying any one of
the cells in phase space. The definition of entropy in this phase space is fairly abstract,
and depends on the volume of the phase space occupied by the particles.
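Concretely, if the occupied region is divided into W statistically equivalent cells (the symbol W is introduced here for use in the formula below), each cell is occupied with probability

p_i = 1/W,   i = 1, 2, ..., W,

so the entropy defined next grows with the number of occupied cells, that is, with the occupied volume.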
Boltzmann's expression for entropy is

S = k_B ln W,                                                    (1.27)

where W is the volume of phase space occupied by the web of interest and the proportionality constant k_B is the Boltzmann constant. If one considers two independent networks B_1 and B_2 with entropies S_1 and S_2, respectively, then the entropy of the combined network is just the arithmetical sum S_1 + S_2, as it would be for the energy. The entropy is consequently extensive through the logarithmic assumption in (1.27), which means that the measure of disorder for the combined system, W_com, is given by the product of the individual volumes, that is, W_com = W_1 W_2. The quantity W indicates disorder that in classical physics is due to thermal motion of the particles in the environment.
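Written out, the additivity follows in one line from (1.27) and the product rule for the volumes:

S_com = k_B ln W_com = k_B ln(W_1 W_2) = k_B ln W_1 + k_B ln W_2 = S_1 + S_2.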
Entropy can also be expressed in more physical terms through the use of the continuous phase-space distribution function, ρ(Γ, t), where Γ
represents the phase-space
variables, the displacements and momenta describing the dynamics of the N -particle
system. The phase-space function keeps track of where all the particles in the system are
as a function of time and what they are doing. Boltzmann (1844-1906) was able to show
that the entropy could be defined in terms of the phase-space distribution function as
S(t) = −k_B ∫ ρ(Γ, t) ln ρ(Γ, t) dΓ,                            (1.28)
which describes the fluid-like motion of a gas and is a non-decreasing function of
time. This definition of entropy attains its maximum value when the web achieves
thermodynamic equilibrium,
in which case the phase-space distribution function
becomes independent of time.
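As a numerical illustration (not part of the original argument), the integral in (1.28) can be discretized over phase-space cells with occupation probabilities p_i, giving S = −k_B Σ_i p_i ln p_i. A minimal Python sketch, with k_B set to 1 and a hypothetical helper named statistical_entropy, shows that this measure is largest for the uniform (equilibrium-like) distribution and vanishes when the state is known to lie in a single cell:

import numpy as np

def statistical_entropy(p, k_B=1.0):
    # Discretized form of (1.28): S = -k_B * sum_i p_i ln p_i,
    # where p_i is the probability of occupying phase-space cell i.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # treat 0 * ln 0 as 0
    return -k_B * np.sum(p * np.log(p))

W = 1000                                # number of phase-space cells

uniform = np.full(W, 1.0 / W)           # every cell equally probable
concentrated = np.zeros(W)              # all probability in a single cell
concentrated[0] = 1.0

print(statistical_entropy(uniform))        # ln(1000) ≈ 6.908, the maximum
print(statistical_entropy(concentrated))   # 0.0, the minimum

The uniform case reproduces S = k_B ln W of (1.27) with W equally probable cells, while the concentrated case corresponds to the minimum entropy of a web whose coordinates and momenta are known with extreme precision, as discussed below.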
We refer to the definition of entropy as given by Boltzmann (1.28) as the statistical
entropy. This development reached maturity in the hands of Gibbs, who attempted to
provide the mechanical basis of the description of thermodynamic phenomena through
the formulation of statistical mechanics. Gibbs gave a probability interpretation to the
phase-space distribution function, and introduced the notion of ensembles into the inter-
pretation of physical experiments. The above statistical definition of entropy is very
general and is one of the possible measures of complexity that we seek. In fact, if the
network is simple and we are able to measure the coordinates and the momenta of all
the particles with extreme precision, we have from (1.28) that this entropy is a min-
imum. A simple web, namely one that is closed and whose equations of motion are
integrable, does not have any growth of entropy, due to the time-reversibility of the