If X and Y are independent random variables, then, from the definitions of entropy, joint entropy, conditional entropy, and independence of random variables, it follows that the entropy of Z is the sum of the entropies of its components X and Y.
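As an illustration of this additivity (a minimal numerical sketch, not taken from the source; the two discrete distributions and the helper function are chosen here only for demonstration):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (natural logarithm) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # terms with zero probability contribute 0
    return -np.sum(p * np.log(p))

# Two independent discrete random variables X and Y (illustrative distributions).
p_x = np.array([0.2, 0.3, 0.5])
p_y = np.array([0.6, 0.4])

# Under independence, the joint distribution is the outer product of the marginals.
p_xy = np.outer(p_x, p_y)

print(entropy(p_xy.ravel()))              # H(X, Y)
print(entropy(p_x) + entropy(p_y))        # H(X) + H(Y): the same value
```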
According to Shannon's information theory, among the probability distributions with a given variance, entropy reaches its maximum value at the normal distribution having that variance (see Table 4.8 for a proof in the continuous case). Therefore, the entropy of the joint probability distribution of Z, where $Z^2 = X^2 + Y^2$, reaches its maximum when both components X and Y reach their maxima. As we have shown above, the Pythagorean recombination game transforms a distribution into another one having the same variance; therefore, if the distributions of the two Pythagorean components of these distributions evolve toward normal distributions, then, along the game, the H function evolves toward its minimum value (and S toward its maximum value).
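A small numerical check of this maximum-entropy property (a sketch, not from the source: the uniform distribution is an arbitrarily chosen competitor with the same variance, and the closed-form entropy of the normal distribution, $\frac{1}{2}\ln(2\pi e\sigma^2)$, is the one deduced in Table 4.7 below):

```python
import numpy as np

sigma2 = 1.5   # common variance shared by both distributions (arbitrary value)

# Differential entropy of the normal distribution with variance sigma2
# (closed form deduced in Table 4.7).
h_normal = 0.5 * np.log(2 * np.pi * np.e * sigma2)

# Differential entropy of the uniform distribution on [-a, a] with the same
# variance: Var = a**2 / 3, hence a = sqrt(3 * sigma2), and the entropy is ln(2a).
a = np.sqrt(3 * sigma2)
h_uniform = np.log(2 * a)

print(h_normal, h_uniform)   # h_normal exceeds h_uniform for every sigma2 > 0
```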
The entropy of a normal distribution is $\frac{1}{2}\ln(2\pi e\sigma^2)$, as deduced in Table 4.7.
Table 4.7  The entropy of the Gaussian distribution $N$ of mean 0 and variance $\sigma^2$
$$
\begin{aligned}
S(N) &= -\int_{-\infty}^{+\infty} N(x)\,\ln N(x)\,dx \\
     &= -\int_{-\infty}^{+\infty} N(x)\,\ln\!\left(\frac{e^{-\frac{x^2}{2\sigma^2}}}{\sqrt{2\pi\sigma^2}}\right)dx \\
     &= -\int_{-\infty}^{+\infty} N(x)\left[-\frac{x^2}{2\sigma^2} - \frac{1}{2}\ln(2\pi\sigma^2)\right]dx \\
     &= \frac{1}{2\sigma^2}\int_{-\infty}^{+\infty} N(x)\,x^2\,dx + \frac{1}{2}\ln(2\pi\sigma^2)\int_{-\infty}^{+\infty} N(x)\,dx \\
     &= \frac{E(x^2)}{2\sigma^2} + \frac{1}{2}\ln(2\pi\sigma^2)\cdot 1 \\
     &= \frac{1}{2} + \frac{1}{2}\ln(2\pi\sigma^2) \\
     &= \frac{1}{2}\left(1 + \ln(2\pi\sigma^2)\right) \\
     &= \frac{1}{2}\left(\ln e + \ln(2\pi\sigma^2)\right) \\
     &= \frac{1}{2}\ln(2\pi e\sigma^2)
\end{aligned}
$$
where we used $\int_{-\infty}^{+\infty} N(x)\,dx = 1$ and $E(x^2) = \sigma^2$ (since the mean is 0).
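The closed form obtained in Table 4.7 can also be checked by direct numerical integration (a sketch; the value of $\sigma$ and the integration grid are arbitrary choices):

```python
import numpy as np

sigma = 0.8                                   # arbitrary standard deviation
x = np.linspace(-12 * sigma, 12 * sigma, 200_001)
dx = x[1] - x[0]
N = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# S(N) = -integral of N(x) ln N(x) dx, approximated by a Riemann sum.
numeric = -np.sum(N * np.log(N)) * dx

# Closed form derived in Table 4.7: (1/2) ln(2 pi e sigma^2).
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print(numeric, closed_form)                   # agree to several decimal places
```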
The entropic divergence (or Kullback-Leibler divergence) between two probability distributions $p$, $q$, denoted by $D(p \parallel q)$, is given by:
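$$
D(p \parallel q) = \sum_{x} p(x)\,\ln\frac{p(x)}{q(x)}
$$
(stated here in its standard discrete form, with the natural logarithm to match the entropy convention used above; in the continuous case the sum is replaced by an integral). A minimal sketch computing it, with illustrative distributions not taken from the source:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum over x of p(x) * ln(p(x) / q(x)).

    Assumes q(x) > 0 wherever p(x) > 0; terms with p(x) = 0 contribute 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))   # nonnegative, and 0 exactly when p equals q
print(kl_divergence(p, p))   # 0.0
```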
 