maximize the power of experiments (or simulations) to distinguish between different classes of models (155,156).
Perhaps no aspect of methodology is more neglected in complex systems
science than this one. While it is always perfectly legitimate to announce a new
mechanism as a way of generating a phenomenon, it is far too common for it to
be called the way to do it, and vanishingly rare to find an examination of how it
differs from previously proposed mechanisms. Newman and Palmer's work on
extinction models (157) stands out in this regard for its painstaking examination
of the ways of discriminating between the various proposals in the literature.
7. INFORMATION THEORY
Information theory began as a branch of communications engineering,
quantifying the length of codes needed to represent randomly varying signals,
and the rate at which data can be transmitted over noisy channels. The concepts
needed to solve these problems turn out to be quite fundamental measures of the
uncertainty, variability, and interdependence of different variables. Information theory thus is an important tool for studying complex systems, and in addition is indispensable for understanding complexity measures (§8).
7.1. Basic Definitions
Our notation and terminology follow those of Cover and Thomas's standard textbook (158).
Given a random variable X taking values in a discrete set \mathcal{X}, the entropy or information content H[X] of X is
H[X] = -\sum_{a \in \mathcal{X}} \Pr(X = a) \log_2 \Pr(X = a).   [41]
H[X] is the expectation value of -log_2 Pr(X). It represents the uncertainty in X, interpreted as the mean number of binary distinctions (bits) needed to identify the value of X. Alternatively, it is the minimum number of bits needed to encode or describe X. Note that H[X] = 0 if and only if X is (almost surely) constant.
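As a concrete illustration (a minimal sketch, not part of the original text; the function name `entropy` and the use of empirical frequencies as estimates of Pr(X = a) are assumptions made for the example), the entropy of equation [41] can be computed from a sample in a few lines of Python:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    h = 0.0
    for count in counts.values():
        p = count / n          # Pr(X = a) estimated by relative frequency
        h -= p * math.log2(p)  # accumulate -Pr(X = a) log2 Pr(X = a)
    return h

# A fair coin carries exactly one bit of uncertainty; a constant variable
# carries none, matching the remark that H[X] = 0 iff X is constant.
print(entropy(["heads", "tails"]))  # 1.0
print(entropy(["heads"] * 4))       # 0.0
```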
The joint entropy H[X,Y] of two variables X and Y is the entropy of their joint distribution:
H[X,Y] = -\sum_{a,b} \Pr(X = a, Y = b) \log_2 \Pr(X = a, Y = b).   [42]
The conditional entropy of X given Y is H[X|Y] = H[X,Y] - H[Y], the mean uncertainty remaining in X once the value of Y is known.
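Continuing the illustrative sketch above (again an assumption, not code from the text; the helper names `joint_entropy` and `conditional_entropy` are hypothetical), the joint entropy of equation [42] can be estimated from paired samples, and H[X|Y] then follows from the identity H[X|Y] = H[X,Y] - H[Y]:

```python
import math
from collections import Counter

def joint_entropy(pairs):
    """Joint entropy H[X,Y] (in bits) of the empirical distribution of (x, y) pairs."""
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(pairs):
    """Conditional entropy H[X|Y] = H[X,Y] - H[Y]."""
    h_joint = joint_entropy(pairs)
    h_y = joint_entropy([y for _, y in pairs])  # entropy of the Y marginal
    return h_joint - h_y

# Two independent fair bits: H[X,Y] = 2 bits, and knowing Y leaves
# one full bit of uncertainty in X, so H[X|Y] = 1.
pairs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(joint_entropy(pairs))        # 2.0
print(conditional_entropy(pairs))  # 1.0
```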