Given the conditional probability distributions $P_{X|Y=y}$ (i.e., the distributions of $X$ given that $Y = y$), we can define the conditional entropy of the random variable $X$ when given the random variable $Y$ as the weighted average of the conditional uncertainties of $X$ given that $Y = y$:
\[
\begin{aligned}
H(X \mid Y) &= \sum_{y} P_Y(y)\, H(X \mid Y = y) \\
            &= -\sum_{y} P_Y(y) \sum_{x} P_{X|Y=y}(x) \log_2 P_{X|Y=y}(x) \\
            &= -\sum_{y} \sum_{x} P_Y(y)\, \frac{P_{XY}(x,y)}{P_Y(y)} \log_2 P_{X|Y}(x,y) \\
            &= -\sum_{(x,y)} P_{XY}(x,y) \log_2 P_{X|Y}(x,y)
\end{aligned}
\]
In this series of equations, the indices of the sums are written in a simplified way. In fact, $x$ stands for $x \in \mathcal{X} : P_{X|Y=y}(x) \neq 0$, $y$ stands for $y \in \mathcal{Y} : P_Y(y) \neq 0$, and, similar to (5.2), $(x, y)$ stands for all possible pairs $(x, y)$ with $x \in \mathcal{X}$ and $y \in \mathcal{Y}$, or all $(x_i, y_j)$ for $i = 1, \ldots, n$ and $j = 1, \ldots, m$.
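To make the definition concrete, the following Python sketch (not part of the original text; the joint distribution p_xy is invented for illustration) computes $H(X \mid Y)$ both as the weighted average of the conditional uncertainties $H(X \mid Y = y)$ and via the joint form in the last line of the derivation above; the two results agree.

```python
from math import log2

# Joint probabilities P_XY(x, y); the numbers are made up for illustration.
p_xy = {
    ('a', 0): 0.25, ('b', 0): 0.25,   # given Y = 0, X is uniform on {a, b}
    ('a', 1): 0.40, ('b', 1): 0.10,   # given Y = 1, X is biased toward 'a'
}

# Marginal P_Y(y), obtained by summing the joint distribution over x.
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Form 1: H(X|Y) = sum_y P_Y(y) H(X|Y = y), the weighted average.
h1 = 0.0
for y, py in p_y.items():
    # Conditional distribution P_{X|Y=y}(x) = P_XY(x, y) / P_Y(y).
    cond = [p / py for (x, yy), p in p_xy.items() if yy == y]
    h1 += py * -sum(q * log2(q) for q in cond if q > 0)

# Form 2: H(X|Y) = -sum_{(x,y)} P_XY(x, y) log2 P_{X|Y}(x, y).
h2 = -sum(p * log2(p / p_y[y]) for (x, y), p in p_xy.items() if p > 0)

print(h1, h2)  # both print approximately 0.861
```

Either form can be used in practice; the joint form avoids constructing each conditional distribution explicitly.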
Note that in contrast to the previously introduced entropies, such as $H(X) = H(P_X)$, $H(XY) = H(P_{XY})$, or $H(X \mid Y = y) = H(P_{X|Y=y})$, the entropy $H(X \mid Y)$ is not the entropy of a specific probability distribution, but rather the expectation of the entropies $H(X \mid Y = y)$. It can be shown that
\[
0 \leq H(X \mid Y) \leq H(X)
\]
with equality on the left if and only if $X$ is uniquely determined by $Y$, and with equality on the right if and only if $X$ and $Y$ are (statistically) independent. More precisely, it can be shown that
\[
H(XY) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y),
\]
(i.e., the joint entropy of $X$ and $Y$ is equal to the entropy of $X$ plus the entropy of $Y$ given $X$, or the entropy of $Y$ plus the entropy of $X$ given $Y$). This equation is sometimes referred to as the chain rule and can be used repeatedly to expand $H(X_1 \cdots X_n)$ as
\[
H(X_1 \cdots X_n) = H(X_1) + H(X_2 \mid X_1) + \ldots + H(X_n \mid X_1 \cdots X_{n-1}).
\]
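As a sanity check, the following Python sketch (again not part of the original text; the two joint distributions are hand-made toy examples) numerically verifies the chain rule $H(XY) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)$, together with the two equality cases of $0 \leq H(X \mid Y) \leq H(X)$: independence on the right and unique determination on the left.

```python
from math import log2, isclose

def entropy(probs):
    """Entropy H(P) = -sum_i p_i log2 p_i of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def entropies(p_xy):
    """Return H(X), H(Y), H(XY), H(X|Y), H(Y|X) for a joint dict p_xy."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    h_x, h_y = entropy(p_x.values()), entropy(p_y.values())
    h_xy = entropy(p_xy.values())
    # H(X|Y) and H(Y|X) via the joint form derived above.
    h_x_given_y = -sum(p * log2(p / p_y[y])
                       for (x, y), p in p_xy.items() if p > 0)
    h_y_given_x = -sum(p * log2(p / p_x[x])
                       for (x, y), p in p_xy.items() if p > 0)
    return h_x, h_y, h_xy, h_x_given_y, h_y_given_x

# Case 1: X and Y independent, so H(X|Y) meets the upper bound H(X).
indep = {(x, y): 0.5 * 0.25 for x in 'ab' for y in range(4)}
h_x, h_y, h_xy, h_xgy, h_ygx = entropies(indep)
assert isclose(h_xgy, h_x)                    # equality on the right
assert isclose(h_xy, h_x + h_ygx) and isclose(h_xy, h_y + h_xgy)

# Case 2: X uniquely determined by Y (x is the parity of y), so H(X|Y) = 0.
determined = {(y % 2, y): 0.25 for y in range(4)}
h_x, h_y, h_xy, h_xgy, h_ygx = entropies(determined)
assert isclose(h_xgy, 0.0, abs_tol=1e-12)     # equality on the left
assert isclose(h_xy, h_x + h_ygx) and isclose(h_xy, h_y + h_xgy)

print("chain rule and both equality cases verified")
```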