where p_{c,k} is the probability that the outcome of C is c and that of K is k, whereas p_{k|c} is the conditional probability that k occurs, given that c occurs.^{11.4}

Of course, conditional entropy may be invoked with any message source, not just cryptologic ones. An important property of conditional entropy is the following, which marries the notions of joint and conditional entropy.
The Chain Rule: The joint entropy and the conditional entropy are given by the following Chain Rule, where S is one message source and S′ is another:

H(S, S′) = H(S) + H(S′|S).    (11.4)
The Chain Rule tells us that the joint uncertainty of the pair (S, S′) is the uncertainty of S plus the uncertainty of S′ given that S is known.
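As a sketch, the Chain Rule (11.4) can be checked numerically on a small joint distribution. The distribution below and the variable names are illustrative assumptions, not from the text:

```python
from math import log2

# Assumed toy joint distribution p(s, t) for two message sources S and S'.
joint = {
    ('a', 'x'): 0.30, ('a', 'y'): 0.20,
    ('b', 'x'): 0.10, ('b', 'y'): 0.40,
}

def entropy(dist):
    """Shannon entropy H = -sum p*log2(p) over nonzero probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal of S: p(s) = sum over t of p(s, t).
p_s = {}
for (s, t), p in joint.items():
    p_s[s] = p_s.get(s, 0.0) + p

# Conditional entropy H(S'|S) = sum over s of p(s) * H(S' | S = s).
h_cond = 0.0
for s, ps in p_s.items():
    cond = {t: joint[(s2, t)] / ps for (s2, t) in joint if s2 == s}
    h_cond += ps * entropy(cond)

# Chain Rule (11.4): H(S, S') = H(S) + H(S'|S).
assert abs(entropy(joint) - (entropy(p_s) + h_cond)) < 1e-9
```

Any valid joint distribution would satisfy the same identity; the chosen probabilities merely make the check concrete.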
Mutual Information: If S and S′ are message sources, then their mutual information is the uncertainty of S reduced when S′ is known:
I(S, S′) = H(S) − H(S|S′).
Thus, I(S, S′) measures the amount of information learned about S that is obtained by learning S′. The following material both tells us that mutual information is nonnegative and gives us criteria for when it is zero.
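A minimal numeric sketch of I(S, S′) = H(S) − H(S|S′), using two assumed toy joint distributions, one dependent and one independent:

```python
from math import log2

def entropy(dist):
    """Shannon entropy H = -sum p*log2(p) over nonzero probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(S, S') = H(S) - H(S|S') for a joint dict {(s, t): prob}."""
    p_s, p_t = {}, {}
    for (s, t), p in joint.items():
        p_s[s] = p_s.get(s, 0.0) + p
        p_t[t] = p_t.get(t, 0.0) + p
    # H(S|S') = sum over t of p(t) * H(S | S' = t).
    h_s_given_t = 0.0
    for t, pt in p_t.items():
        cond = {s: joint[(s, t2)] / pt for (s, t2) in joint if t2 == t}
        h_s_given_t += pt * entropy(cond)
    return entropy(p_s) - h_s_given_t

# Assumed example distributions (not from the text).
dependent = {('a', 'x'): 0.4, ('a', 'y'): 0.1,
             ('b', 'x'): 0.1, ('b', 'y'): 0.4}
independent = {('a', 'x'): 0.25, ('a', 'y'): 0.25,
               ('b', 'x'): 0.25, ('b', 'y'): 0.25}

assert mutual_information(dependent) > 0          # strictly positive here
assert abs(mutual_information(independent)) < 1e-9  # zero for independent sources
```

The two assertions illustrate the claims that follow: mutual information is nonnegative, and it vanishes exactly when the sources are independent.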
The Role of Conditional Entropy: Perhaps one of the most important facts from Information Theory is the following inequality:

H(S|S′) ≤ H(S),    (11.5)
which tells us that the uncertainty about S when we know S′ is no greater than the uncertainty about S. As we have seen above, when the events are independent, equality holds (and it can be shown that equality cannot hold otherwise). We may deduce that knowing S′ can only yield information about S; namely, knowing S′ cannot increase our uncertainty about S. Incidentally, it is clear that Equation (11.5) may be deduced from Equations (11.2) and (11.4).
The Role of Independence: When S and S′ are independent, the following are equivalent facts.
1. H(S, S′) = H(S) + H(S′).
2. H(S|S′) = H(S).
3. H(S′|S) = H(S′).
4. I(S, S′) = 0.
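The four equivalent facts above can be verified together on an assumed product distribution p(s, t) = p(s)·p(t), which is independent by construction (the marginals below are illustrative):

```python
from math import log2

def entropy(dist):
    """Shannon entropy H = -sum p*log2(p) over nonzero probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def cond_entropy(joint):
    """H(first | second) for a joint dict {(s, t): prob}."""
    p_t = {}
    for (s, t), p in joint.items():
        p_t[t] = p_t.get(t, 0.0) + p
    h = 0.0
    for t, pt in p_t.items():
        cond = {s: joint[(s, u)] / pt for (s, u) in joint if u == t}
        h += pt * entropy(cond)
    return h

# Assumed marginals; the product joint makes S and S' independent.
p_s = {'a': 0.7, 'b': 0.3}
p_t = {'x': 0.5, 'y': 0.5}
joint = {(s, t): ps * pt for s, ps in p_s.items() for t, pt in p_t.items()}
swapped = {(t, s): p for (s, t), p in joint.items()}

h_s, h_t = entropy(p_s), entropy(p_t)
tol = 1e-9
assert abs(entropy(joint) - (h_s + h_t)) < tol   # 1. H(S,S') = H(S) + H(S')
assert abs(cond_entropy(joint) - h_s) < tol      # 2. H(S|S') = H(S)
assert abs(cond_entropy(swapped) - h_t) < tol    # 3. H(S'|S) = H(S')
assert abs(h_s - cond_entropy(joint)) < tol      # 4. I(S,S') = 0
```

Replacing the product joint with any dependent distribution would break all four assertions at once, which is the content of the equivalence.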
11.4 From the results in Appendix E, we know that p_{k|c} = p_{c,k}/p_c.