If we indicate an outcome by first indicating which of the n groups it belongs to, and second
indicating which member of the group it is, then by our earlier development the average
information H is given by
$$
\begin{aligned}
H &= H(p_1, p_2, \ldots, p_n) + p_1 H\!\left(\frac{1}{n_1}, \ldots, \frac{1}{n_1}\right) + \cdots + p_n H\!\left(\frac{1}{n_n}, \ldots, \frac{1}{n_n}\right) && (7) \\
  &= H(p_1, p_2, \ldots, p_n) + p_1 K \log n_1 + p_2 K \log n_2 + \cdots + p_n K \log n_n && (8) \\
  &= H(p_1, p_2, \ldots, p_n) + K \sum_{i=1}^{n} p_i \log n_i && (9)
\end{aligned}
$$
Equating the expressions in Equations (6) and (9), we obtain

$$
K \log \sum_{j=1}^{n} n_j = H(p_1, p_2, \ldots, p_n) + K \sum_{i=1}^{n} p_i \log n_i
$$
or
$$
\begin{aligned}
H(p_1, p_2, \ldots, p_n) &= K \log \sum_{j=1}^{n} n_j - K \sum_{i=1}^{n} p_i \log n_i \\
&= -K \left[ \sum_{i=1}^{n} p_i \log n_i - \log \sum_{j=1}^{n} n_j \right] \\
&= -K \left[ \sum_{i=1}^{n} p_i \log n_i - \left( \log \sum_{j=1}^{n} n_j \right) \sum_{i=1}^{n} p_i \right] && (10) \\
&= -K \left[ \sum_{i=1}^{n} p_i \log n_i - \sum_{i=1}^{n} p_i \log \sum_{j=1}^{n} n_j \right] \\
&= -K \sum_{i=1}^{n} p_i \left[ \log n_i - \log \sum_{j=1}^{n} n_j \right] \\
&= -K \sum_{i=1}^{n} p_i \log \frac{n_i}{\sum_{j=1}^{n} n_j} && (11) \\
&= -K \sum_{i=1}^{n} p_i \log p_i && (12)
\end{aligned}
$$
where, in Equation (10), we have used the fact that $\sum_{i=1}^{n} p_i = 1$. By convention, we pick $K$ to be 1, and we have the formula
$$
H = -\sum_{i=1}^{n} p_i \log p_i
$$
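As a quick numerical check of the grouping identity behind this derivation (a hypothetical sketch, not part of the text), we can verify with $K = 1$ and base-2 logarithms that the information in one choice among $\sum_j n_j$ equally likely outcomes equals $H(p_1, \ldots, p_n)$ plus the average information needed to pick a member within the chosen group; the group sizes below are made up for illustration:

```python
import math

# Hypothetical example: three groups with n_i equally likely members each.
group_sizes = [2, 3, 5]
total = sum(group_sizes)                   # sum of n_j
p = [n_i / total for n_i in group_sizes]   # p_i = n_i / sum_j n_j

# H(p_1, ..., p_n) computed from the derived formula with K = 1.
H_groups = -sum(p_i * math.log2(p_i) for p_i in p)

# Left side: information in one choice among `total` equally likely outcomes.
lhs = math.log2(total)
# Right side: group entropy plus the average within-group information.
rhs = H_groups + sum(p_i * math.log2(n_i) for p_i, n_i in zip(p, group_sizes))

assert math.isclose(lhs, rhs)   # the two-step description carries the same information
```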
Note that this formula is a natural outcome of the requirements we imposed in the beginning.
It was not artificially forced in any way. Therein lies the beauty of information theory. Like
the laws of physics, its laws are intrinsic in the nature of things. Mathematics is simply a tool
to express these relationships.
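For concreteness, the formula can be evaluated directly; the helper below is an illustrative sketch (not from the text), taking $K = 1$ with base-2 logarithms so that $H$ is measured in bits:

```python
import math

def entropy(probs):
    """Average information H = -sum(p_i * log2(p_i)), in bits (K = 1)."""
    # Terms with p_i = 0 contribute nothing, by the convention 0 log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit per outcome; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
```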
 