As equation [4.10] is written:

I(X;Y) = \sum_{i=1}^{L_X} \sum_{j=1}^{L_Y} p_{XY}(i,j) \log_2 \frac{p_{X|Y}(i|j)}{p_X(i)}

that is, splitting the logarithm and using \sum_{j=1}^{L_Y} p_{XY}(i,j) = p_X(i):

I(X;Y) = -\sum_{i=1}^{L_X} p_X(i) \log_2 p_X(i) + \sum_{i=1}^{L_X} \sum_{j=1}^{L_Y} p_{XY}(i,j) \log_2 p_{X|Y}(i|j)
we find:
I(X;Y) = H(X) - H(X|Y)
The mutual information therefore measures the reduction in uncertainty of X given the knowledge of Y. The conditional entropy H(X|Y) can be considered to be the mean uncertainty of the symbol emitted by the source after the symbol produced at the receiver has been specified. For a slightly noisy channel, the conditional entropy is almost zero and the mutual information is therefore maximum: it is practically equal to the source entropy. The mutual information characterizes the information transfer.
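As a numerical illustration, the following sketch (a minimal example assuming NumPy; the joint distribution p_xy used here is arbitrary and not taken from the text) evaluates the mutual information both directly from equation [4.10] and as H(X) - H(X|Y), and the two results coincide.

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability array (zero terms contribute nothing)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Arbitrary example joint distribution p_XY(i, j) (rows: X, columns: Y).
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

p_x = p_xy.sum(axis=1)   # marginal p_X(i)
p_y = p_xy.sum(axis=0)   # marginal p_Y(j)

# Direct evaluation of equation [4.10]:
# I(X;Y) = sum_i sum_j p_XY(i,j) log2( p_X|Y(i|j) / p_X(i) )
I_direct = 0.0
for i in range(p_xy.shape[0]):
    for j in range(p_xy.shape[1]):
        if p_xy[i, j] > 0:
            p_x_given_y = p_xy[i, j] / p_y[j]
            I_direct += p_xy[i, j] * np.log2(p_x_given_y / p_x[i])

# Evaluation as H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y)
I_entropies = entropy(p_x) - (entropy(p_xy) - entropy(p_y))

print(I_direct, I_entropies)   # both give the same value (about 0.36 bit here)
```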
By developing equation [4.10], we can directly obtain:
I(X;Y) = H(X) + H(Y) - H(X,Y)
When Y = X, we have, since H(X|X) = 0:

I(X;X) = H(X)
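This special case can also be checked numerically: a deterministic relation Y = X corresponds to a diagonal joint distribution, and the short sketch below (again assuming NumPy and an arbitrary source distribution chosen for illustration) shows that I(X;X), computed as H(X) + H(Y) - H(X,Y), equals H(X).

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability array (zero terms contribute nothing)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    return entropy(p_xy.sum(axis=1)) + entropy(p_xy.sum(axis=0)) - entropy(p_xy)

# Y = X: p_XX(i, j) = p_X(i) for i = j, and 0 otherwise.
p_x = np.array([0.2, 0.3, 0.5])
p_xx = np.diag(p_x)

print(mutual_information(p_xx))   # I(X;X)
print(entropy(p_x))               # H(X): both values are about 1.485 bit
```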
The expression for the mutual information, showing the conditional probabilities p_{Y|X}(j|i) which characterize the channel and the source symbol probabilities p_X(i), is given by:
I(X;Y) = \sum_{i=1}^{L_X} \sum_{j=1}^{L_Y} p_X(i)\, p_{Y|X}(j|i) \log_2 \frac{p_{Y|X}(j|i)}{p_Y(j)}
As:
p_Y(j) = \sum_{k=1}^{L_X} p_X(k)\, p_{Y|X}(j|k)
we find:
I(X;Y) = \sum_{i=1}^{L_X} \sum_{j=1}^{L_Y} p_X(i)\, p_{Y|X}(j|i) \log_2 \frac{p_{Y|X}(j|i)}{\sum_{k=1}^{L_X} p_X(k)\, p_{Y|X}(j|k)}
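As a concrete illustration of this last expression, the sketch below evaluates it for a hypothetical binary symmetric channel with error probability eps (this channel and its parameters are an assumption chosen purely for illustration, not part of the text): the source probabilities p_X(i) and the transition probabilities p_{Y|X}(j|i) fully determine I(X;Y).

```python
import numpy as np

def mutual_information(p_x, p_y_given_x):
    """Evaluate I(X;Y) from the source probabilities p_X(i) and the channel
    transition probabilities p_Y|X(j|i) (rows indexed by i, columns by j),
    following the last expression above."""
    # Output distribution: p_Y(j) = sum_k p_X(k) p_Y|X(j|k)
    p_y = p_x @ p_y_given_x
    I = 0.0
    for i in range(len(p_x)):
        for j in range(len(p_y)):
            if p_x[i] > 0 and p_y_given_x[i, j] > 0:
                I += p_x[i] * p_y_given_x[i, j] * np.log2(p_y_given_x[i, j] / p_y[j])
    return I

# Hypothetical binary symmetric channel with error probability eps, uniform source.
eps = 0.1
p_y_given_x = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])
p_x = np.array([0.5, 0.5])

print(mutual_information(p_x, p_y_given_x))  # about 0.531 bit, i.e. 1 - H_b(0.1)
```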