8.7 Mutual Information
Before receiving the transmitted symbol $x_j$, the receiver's state of knowledge is the probability that $x_j$ would be selected for transmission. This is known as the a priori probability $p(x_j)$. After receiving the symbol $y_k$, the state of knowledge concerning $x_j$ is the conditional probability $p(x_j \mid y_k)$, known as the a posteriori probability.
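To make the a priori/a posteriori distinction concrete, here is a minimal sketch that computes the a posteriori probability with Bayes' rule, $p(x_j \mid y_k) = p(y_k \mid x_j)\,p(x_j)/p(y_k)$. The binary channel and all of its probabilities are illustrative assumptions, not values from this chapter.

```python
# Hypothetical binary channel: x in {0, 1} sent, y in {0, 1} received.
# All numbers below are illustrative assumptions.

p_x = {0: 0.7, 1: 0.3}                      # a priori probabilities p(x_j)

# Channel transition probabilities p(y_k | x_j): 10% crossover (assumed).
p_y_given_x = {(0, 0): 0.9, (1, 0): 0.1,
               (0, 1): 0.1, (1, 1): 0.9}    # keyed by (y, x)

def posterior(x, y):
    """A posteriori probability p(x | y) via Bayes' rule."""
    p_y = sum(p_y_given_x[(y, xp)] * p_x[xp] for xp in p_x)
    return p_y_given_x[(y, x)] * p_x[x] / p_y

# Receiving y = 0 raises the probability that x = 0 was sent:
print(p_x[0], "->", posterior(0, 0))        # 0.7 -> ~0.955
```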
Thus, before and after reception, the uncertainties are $-\log p(x_j)$ and $-\log p(x_j \mid y_k)$, respectively. The information gained about $x_j$ by the reception of $y_k$ is the net reduction in the uncertainty, and is known as the mutual information $I(x_j; y_k)$.
Thus,

$$
\begin{aligned}
I(x_j; y_k) &= \text{initial uncertainty} - \text{final uncertainty} \\
&= -\log p(x_j) - \bigl(-\log p(x_j \mid y_k)\bigr) \\
&= \log \frac{p(x_j \mid y_k)}{p(x_j)} = \log \frac{p(x_j, y_k)}{p(x_j)\, p(y_k)} \\
&= \log \frac{p(y_k \mid x_j)}{p(y_k)} = I(y_k; x_j)
\end{aligned}
\tag{8.30}
$$
Equation (8.30) shows that mutual information is symmetric: $I(x_j; y_k) = I(y_k; x_j)$.
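This symmetry can be checked numerically. Continuing the hypothetical channel above, the snippet below evaluates both forms of Eq. (8.30), $\log[p(x_j \mid y_k)/p(x_j)]$ and $\log[p(y_k \mid x_j)/p(y_k)]$, and confirms they agree; the probabilities are the same illustrative assumptions as before.

```python
import math

# Same hypothetical numbers as the Bayes sketch above.
p_x = {0: 0.7, 1: 0.3}
p_y_given_x = {(0, 0): 0.9, (1, 0): 0.1,
               (0, 1): 0.1, (1, 1): 0.9}    # keyed by (y, x)

def p_y(y):
    return sum(p_y_given_x[(y, x)] * p_x[x] for x in p_x)

def info_x_given_y(x, y):
    """I(x; y) = log2[ p(x|y) / p(x) ], in bits."""
    p_x_given_y = p_y_given_x[(y, x)] * p_x[x] / p_y(y)
    return math.log2(p_x_given_y / p_x[x])

def info_y_given_x(x, y):
    """I(y; x) = log2[ p(y|x) / p(y) ], in bits."""
    return math.log2(p_y_given_x[(y, x)] / p_y(y))

# Both forms of Eq. (8.30) give the same value:
print(info_x_given_y(0, 0), info_y_given_x(0, 0))  # ~0.4475 bits each
```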
Next, the average mutual information, i.e., the expectation of $I(x_j; y_k)$ over the joint distribution of $X$ and $Y$, is given by

$$
\begin{aligned}
I(X; Y) &= \sum_{j=1}^{m} \sum_{k=1}^{n} p(x_j, y_k)\, I(x_j; y_k)
= \sum_{j=1}^{m} \sum_{k=1}^{n} p(x_j, y_k) \log \frac{p(x_j \mid y_k)}{p(x_j)} \\
&= \sum_{j=1}^{m} \sum_{k=1}^{n} p(x_j, y_k) \bigl[ \log p(x_j \mid y_k) - \log p(x_j) \bigr] \\
&= -\left[ \sum_{j=1}^{m} \sum_{k=1}^{n} p(x_j, y_k) \log p(x_j) - \sum_{j=1}^{m} \sum_{k=1}^{n} p(x_j, y_k) \log p(x_j \mid y_k) \right] \\
&= -\sum_{j=1}^{m} \log p(x_j) \sum_{k=1}^{n} p(x_j, y_k) - H(X/Y) \\
&= -\sum_{j=1}^{m} p(x_j) \log p(x_j) - H(X/Y) \\
&= H(X) - H(X/Y)
\end{aligned}
$$

since $\sum_{k=1}^{n} p(x_j, y_k) = p(x_j)$ and $-\sum_{j=1}^{m} p(x_j) \log p(x_j) = H(X)$. Thus the average mutual information is the source entropy $H(X)$ reduced by the conditional entropy $H(X/Y)$, i.e., by the uncertainty about $X$ that remains after $Y$ is observed.
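The identity $I(X; Y) = H(X) - H(X/Y)$ can be verified numerically. The sketch below uses the joint pmf induced by the hypothetical channel from the earlier examples (all values remain illustrative assumptions) and computes the average mutual information both as the double sum and as the entropy difference.

```python
import math

# Hypothetical joint pmf p(x_j, y_k), keyed by (j, k); these are the
# illustrative channel numbers used above (p(x)=0.7/0.3, 10% crossover).
p_xy = {(0, 0): 0.63, (0, 1): 0.07,
        (1, 0): 0.03, (1, 1): 0.27}

# Marginals p(x_j) and p(y_k) from the joint pmf.
p_x = {j: sum(v for (jj, k), v in p_xy.items() if jj == j) for j in (0, 1)}
p_y = {k: sum(v for (j, kk), v in p_xy.items() if kk == k) for k in (0, 1)}

# Average mutual information as the double sum of Eq. (8.30) terms.
I_sum = sum(v * math.log2(v / (p_x[j] * p_y[k]))
            for (j, k), v in p_xy.items())

# The same quantity as H(X) - H(X/Y).
H_x = -sum(p * math.log2(p) for p in p_x.values())
H_x_given_y = -sum(v * math.log2(v / p_y[k]) for (j, k), v in p_xy.items())

print(I_sum, H_x - H_x_given_y)   # both ~0.456 bits; the two values match
```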