7.7.1 Introduction to information theory
In 1948, Shannon proposed and developed information theory, which studies information and its mathematical measurement: the amount of information is measured by the degree to which it eliminates uncertainty about the symbols of an information source. The theory introduces a series of concepts:
(1) Self-information. Before receiving $a_i$, the receiver is uncertain whether the information source will send $a_i$. This uncertainty is defined as the self-information $I(a_i)$ of the symbol $a_i$, i.e. $I(a_i) = -\log p(a_i)$, where $p(a_i)$ is the probability that the information source sends $a_i$.
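As a small illustration (the probabilities and the helper name `self_information` are assumptions for this sketch, not part of the original text), the self-information of a symbol can be evaluated directly from its probability, using the base-2 logarithm introduced with Eq. (7.22) below:

```python
import math

def self_information(p: float) -> float:
    """Self-information I(a_i) = -log2 p(a_i), measured in bits."""
    return -math.log2(p)

# A symbol sent with probability 1/2 carries 1 bit; a rarer symbol carries more.
print(self_information(0.5))    # 1.0
print(self_information(0.125))  # 3.0
```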
(2) Information entropy. Self-information reflects the uncertainty of an individual symbol, while information entropy measures the uncertainty of the whole information source $X$. It is defined as follows:
$H(X) = p(a_1)I(a_1) + p(a_2)I(a_2) + \cdots + p(a_r)I(a_r) = -\sum_{i=1}^{r} p(a_i) \log p(a_i)$   (7.22)

where $r$ is the number of possible symbols of the information source $X$.

Information entropy is thus the average self-information provided by the information source when it sends one symbol. Here $\log$ denotes the logarithm to base 2.
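To make Eq. (7.22) concrete, the following minimal sketch computes the entropy of a discrete source from its symbol probabilities (the example distributions and the helper name `entropy` are illustrative assumptions):

```python
import math

def entropy(probs):
    """H(X) = -sum_i p(a_i) * log2 p(a_i), per Eq. (7.22); zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A four-symbol source: the uniform distribution is the most uncertain (2 bits).
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```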
(3) Conditional entropy. When the information source $X$ and the random variable $Y$ are not mutually independent, the conditional entropy $H(X/Y)$ measures the uncertainty that still remains about $X$ after the receiver has received the random variable $Y$. Let $X$ take the source symbols $a_i$ and $Y$ take the symbols $b_j$, and let $p(a_i b_j)$ denote their joint probability and $p(a_i/b_j)$ the conditional probability of $a_i$ given $b_j$; the conditional entropy is then defined as follows:

$H(X/Y) = -\sum_{i=1}^{r}\sum_{j=1}^{s} p(a_i b_j) \log p(a_i/b_j)$   (7.23)

where $s$ is the number of possible symbols of $Y$.
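A minimal sketch of Eq. (7.23), assuming the joint distribution $p(a_i b_j)$ is supplied as a matrix whose entries sum to 1 (the matrix values and the helper name `conditional_entropy` are illustrative assumptions, not from the text):

```python
import math

def conditional_entropy(p_joint):
    """H(X/Y) per Eq. (7.23): -sum_ij p(a_i b_j) * log2 p(a_i / b_j),
    with p(a_i / b_j) = p(a_i b_j) / p(b_j)."""
    p_b = [sum(row[j] for row in p_joint) for j in range(len(p_joint[0]))]  # marginal p(b_j)
    h = 0.0
    for row in p_joint:
        for j, p_ab in enumerate(row):
            if p_ab > 0:
                h -= p_ab * math.log2(p_ab / p_b[j])
    return h

# Rows index a_i, columns index b_j; an assumed joint distribution p(a_i b_j).
p_joint = [[0.45, 0.10],
           [0.05, 0.40]]
print(conditional_entropy(p_joint))  # about 0.60 bits
```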
(4) Average mutual information. It represents the amount of information about $X$ provided by the signal $Y$, and is written $I(X, Y)$:

$I(X, Y) = H(X) - H(X/Y)$   (7.24)
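Combining the pieces, Eq. (7.24) can be evaluated by reusing the `entropy` and `conditional_entropy` helpers sketched above (the joint distribution is the same assumed example):

```python
def average_mutual_information(p_joint):
    """I(X, Y) = H(X) - H(X/Y), per Eq. (7.24)."""
    p_a = [sum(row) for row in p_joint]  # marginal p(a_i)
    return entropy(p_a) - conditional_entropy(p_joint)

p_joint = [[0.45, 0.10],
           [0.05, 0.40]]
print(average_mutual_information(p_joint))  # about 0.40 bits
```

For this assumed source, receiving $Y$ removes roughly 0.4 of the approximately 1 bit of uncertainty in $X$.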