dynamical equations. Here both "integrable" and "time-reversible" dynamics are consequences of Newton's laws of motion.
Wiener [39] and subsequently Shannon [29] determined how to construct a formal measure of the amount of information contained by a web, and characterized the problems associated with the transmission of a message within a web and between webs. Shannon expressed information in terms of bits, the number of binary digits in a sequence. He proved that a web with $N$ possible outputs, where output $i$ occurs with probability $p_i$, can be described by the function
$$
H = -\sum_{i=1}^{N} p_i \log_2 p_i ,
\qquad (1.29)
$$
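As a minimal numerical sketch of Eq. (1.29) (the function name and the example distributions are illustrative, not from the text):

```python
import math

def shannon_entropy(probs):
    """Information entropy H = -sum p_i * log2(p_i), in bits (Eq. 1.29).
    Terms with p_i == 0 contribute nothing, since p log p -> 0 as p -> 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less: its outcome is partly predictable.
print(shannon_entropy([0.9, 0.1]))
```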
which is the information entropy, with the logarithm taken to base two and Boltzmann's constant set to unity. The information entropy $H$ attains its maximum value at the extremum, where its variation, denoted by the operator $\delta$, vanishes subject to the normalization condition:
$$
\delta \left[ H + \lambda \left( \sum_{i=1}^{N} p_i - 1 \right) \right] = 0 ,
\qquad (1.30)
$$
where $\lambda$ is a Lagrange multiplier. The solution to (1.30) yields the maximum information entropy when each of the possible states of the web has the same probability of occurrence, that is, when maximal randomness (maximum uncertainty) occurs:
$$
p_i = 1/N .
\qquad (1.31)
$$
In this case of maximal randomness the entropy
$$
H_{\max} = \log_2 N
\qquad (1.32)
$$
is also maximum.
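A quick self-contained check of Eqs. (1.31)–(1.32): the uniform distribution $p_i = 1/N$ attains $H = \log_2 N$, and any departure from uniformity lowers the entropy (the helper function and distributions are illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits (Eq. 1.29)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 8
uniform = [1.0 / N] * N              # Eq. (1.31): p_i = 1/N
print(shannon_entropy(uniform))       # 3.0, i.e. log2(8), as in Eq. (1.32)

# Concentrating probability on one state lowers H below log2(N).
skewed = [0.5] + [0.5 / (N - 1)] * (N - 1)
print(shannon_entropy(skewed) < math.log2(N))   # True
```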
The information entropy is the discrete equivalent of Gibbs' treatment of Boltzmann's entropy, as Shannon discusses at the end of his 1948 article. The analytic expressions for the entropy, disregarding the unit of measure and therefore the base of the logarithm, are exactly the same, but the informational interpretation offers possibilities of extending the definition of entropy to situations involving conditional probabilities, resulting in conditional entropies, mutual entropies, and so on. This means that it is possible to recognize two equivalent pieces of information and to disregard the "copy," because nothing new is learned from it. It is possible to extract the new pieces of information from a message most of whose content is already known; the definition is therefore useful for separating the knowable but unknown from the known. New pieces of information decrease the level of uncertainty, and the more complex the web, the more information it can carry.
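The idea that an exact copy carries nothing new can be made concrete with the mutual information $I(X;Y) = H(X) + H(Y) - H(X,Y)$, one of the extensions via joint and conditional probabilities mentioned above. A self-contained sketch (function names and distributions are illustrative):

```python
import math

def H(probs):
    """Shannon entropy in bits (Eq. 1.29)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): the information X and Y share.
    `joint` maps outcome pairs (x, y) to their joint probability."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return H(px.values()) + H(py.values()) - H(joint.values())

# Y is an exact copy of a fair bit X: the copy adds nothing new,
# because all of Y's one bit is already contained in X.
joint_copy = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint_copy))   # 1.0

# Independent fair bits share no information at all.
joint_indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mutual_information(joint_indep))  # 0.0
```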
It appears that an advanced version of this information-selection process evolved in psychological networks, as is manifest in the phenomenon of habituation. Habituation is the decremental response to a repeated stimulus whose strength does not change. After a short time humans no longer hear the background conversation