We may use this example to illustrate how entropy is related to decision problems (see page 502). In other words, to determine the outcome of the event, one may ask yes-or-no questions, but how many? One could first ask whether the number of tails is greater than one, which narrows the field to two possibilities. If the answer were yes, for instance, the next question could be: is the number of tails greater than three? We then know the outcome after two questions, a number approximately equal to the entropy.
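The connection between entropy and yes-or-no questions can be checked numerically. The following is a minimal sketch in Python (the distribution used here is a hypothetical stand-in, not the exact coin example from the text): for a uniform distribution over four outcomes, the entropy is exactly the two questions an optimal binary search needs.

```python
from math import log2

# Shannon entropy H = -sum p * log2(p), skipping zero-probability terms.
def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical stand-in: four equally likely outcomes, each pinned
# down by exactly two yes-or-no questions in a binary search.
H = entropy([0.25, 0.25, 0.25, 0.25])
print(H)  # 2.0 bits, matching the two questions needed
```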
Joint Entropy: When we have two message sources $S = \{s_1, s_2, \ldots, s_n\}$ and $S' = \{s'_1, s'_2, \ldots, s'_m\}$, the joint entropy is defined by
$$H(S, S') = -\sum_{s_i \in S} \sum_{s'_j \in S'} p_{i,j} \log_2(p_{i,j}),$$
where $p_{i,j}$ is the probability that $s_i$ is the outcome of $S$ and $s'_j$ is the outcome of $S'$.
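The definition above translates directly into code. Here is a minimal sketch in Python, assuming the joint distribution is supplied as a dictionary mapping outcome pairs to probabilities (the toy numbers are illustrative, not from the text):

```python
from math import log2

# Joint entropy H(S, S') = -sum_{i,j} p_{i,j} * log2(p_{i,j}),
# with the joint distribution given as {(s_i, s'_j): p_ij}.
def joint_entropy(joint):
    return -sum(p * log2(p) for p in joint.values() if p > 0)

# Hypothetical toy joint distribution over two small sources.
joint = {("a", "x"): 0.5, ("a", "y"): 0.25, ("b", "x"): 0.25}
print(joint_entropy(joint))  # 1.5 bits
```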
S
It follows that
$$H(S, S') \le H(S) + H(S'), \qquad (11.2)$$
which says that the entropy in the pair $(S, S')$ is, at most, the information contained in $S$ plus the information contained in $S'$; and equality holds precisely when $S$ and $S'$ are independent.

To illustrate this new notion, we move from coins to the more "suitable" deck of cards.
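Inequality (11.2) can be checked numerically. The sketch below uses hypothetical dependent sources (the second outcome simply copies the first), where the inequality is strict because the pair carries no more information than a single source:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical dependent sources: S' always equals S, so only the
# pairs (0,0) and (1,1) ever occur.
joint = {(0, 0): 0.5, (1, 1): 0.5}   # joint probabilities p_{i,j}
p_s = [0.5, 0.5]                     # marginal distribution of S
p_t = [0.5, 0.5]                     # marginal distribution of S'

H_joint = entropy(joint.values())    # 1 bit
H_sum = entropy(p_s) + entropy(p_t)  # 2 bits
assert H_joint <= H_sum              # inequality (11.2), strict here
print(H_joint, H_sum)
```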
Example 11.5 Suppose that a card is drawn from a standard deck of 52 cards, where
$$S = \{\text{clubs}, \text{diamonds}, \text{hearts}, \text{spades}\} = \{s_1, s_2, s_3, s_4\}$$
with $p_j = 1/4$ for each $j = 1, 2, 3, 4$, and
$$S' = \{\text{ace}, 2, 3, 4, \ldots, 10, \text{jack}, \text{queen}, \text{king}\} = \{s'_1, \ldots, s'_{13}\}$$
with $p'_j = 1/13$ for $j = 1, 2, \ldots, 13$. Thus, $S$ and $S'$ are independent, which means that
$$p_{i,j} = p_i \cdot p'_j = \frac{1}{4} \cdot \frac{1}{13} = \frac{1}{52},$$
where $p_i$ is the probability that $s_i$ is the outcome of $S$ and $p'_j$ is the probability that $s'_j$ is the outcome of $S'$. Then the entropy is given by
$$H(S, S') = -\sum_{i=1}^{4} \sum_{j=1}^{13} p_{i,j} \log_2(p_{i,j}) = \log_2(52),$$
the maximum entropy possible, as discussed above.
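The numbers in Example 11.5 can be verified directly. The following sketch computes the marginal entropies of suits and ranks and confirms that, because the sources are independent, the joint entropy attains the bound of (11.2):

```python
from math import log2, isclose

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Independent sources: p_{i,j} = p_i * p'_j = (1/4) * (1/13) = 1/52.
H_suits = entropy([1 / 4] * 4)    # 2 bits
H_ranks = entropy([1 / 13] * 13)  # log2(13) bits
H_joint = entropy([1 / 52] * 52)  # log2(52) bits

# Equality case of (11.2): independence makes the bound tight.
assert isclose(H_joint, H_suits + H_ranks)
assert isclose(H_joint, log2(52))
print(H_joint)  # about 5.70 bits, the maximum for 52 outcomes
```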
Conditional Entropy: Earlier we mentioned the context of cryptography for entropy. We revisit this here in terms of ciphertexts and keys, which was one of Shannon's viewpoints when establishing the basics of Information Theory. We may discuss the conditional entropy of, say, the key, $k \in K$, given the ciphertext, $c \in C$, which is defined as follows:
$$H(K \mid C) = -\sum_{c \in C} \sum_{k \in K} p_{c,k} \log_2(p_{k|c}). \qquad (11.3)$$
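Definition (11.3) can also be sketched in code. The example below uses hypothetical toy probabilities (the labels `c1`, `k1`, etc. are illustrative, not from the text); the joint distribution $p_{c,k}$ is supplied as a dictionary, and $p_{k|c}$ is recovered by dividing out the ciphertext marginal:

```python
from math import log2

# Conditional entropy H(K|C) = -sum_{c,k} p(c,k) * log2(p(k|c)),
# with the joint distribution given as {(c, k): p}.
def conditional_entropy(joint):
    # Ciphertext marginal p(c) = sum_k p(c, k).
    p_c = {}
    for (c, k), p in joint.items():
        p_c[c] = p_c.get(c, 0.0) + p
    # p(k|c) = p(c, k) / p(c).
    return -sum(p * log2(p / p_c[c])
                for (c, k), p in joint.items() if p > 0)

# Hypothetical case where the ciphertext reveals nothing about the
# key: H(K|C) equals the full key entropy H(K).
joint = {("c1", "k1"): 0.25, ("c1", "k2"): 0.25,
         ("c2", "k1"): 0.25, ("c2", "k2"): 0.25}
print(conditional_entropy(joint))  # 1.0 bit of key uncertainty remains
```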