Example 11.1 If $\mathcal{S}$ consists of the toss of a coin with equally likely outcomes, and we encode heads as 1 and tails as 0, then $p_1 = p_2 = 1/2$ with $\mathcal{S} = \{0, 1\}$, and
\[
H(\mathcal{S}) = -\tfrac{1}{2}\log_2(1/2) - \tfrac{1}{2}\log_2(1/2) = 1 \text{ bit}.
\]
In general, if $\mathcal{S}$ has cardinality $n$ such that each message has probability $1/n$, then $H(\mathcal{S}) = \log_2(n)$.
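To make the computation concrete, here is a minimal Python sketch (the helper name entropy is our own choice, not from the text) that evaluates $H(\mathcal{S}) = -\sum_j p_j \log_2(p_j)$ and confirms both the fair-coin value of 1 bit and the value $\log_2(n)$ for a uniform distribution on $n$ messages:

    from math import log2

    def entropy(probs):
        """Shannon entropy -sum_j p_j * log2(p_j), in bits."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Example 11.1: a fair coin with S = {0, 1} and p_1 = p_2 = 1/2.
    print(entropy([1/2, 1/2]))            # 1.0 bit

    # A uniform distribution on n messages gives H(S) = log2(n).
    n = 8
    print(entropy([1/n] * n), log2(n))    # 3.0 3.0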
In general, if $n = |\mathcal{S}|$, then $\log_2(n)$ is the maximum possible value that $H(\mathcal{S})$ may assume. Moreover, this maximum is assumed if and only if the probability distribution is uniform, namely when $p_j = 1/n$ for all $j$. For instance, we have the following.
Example 11.2 Suppose we encode $N$ equally likely, independent coin tosses as bitstrings of length $N$. Then the cardinality of $\mathcal{S}$ is $n = 2^N$ and $p_j = 2^{-N}$ for each $j$, for which the entropy is given by
\[
H(\mathcal{S}) = -\sum_{j=1}^{2^N} 2^{-N}\log_2\bigl(2^{-N}\bigr) = 2^{-N}\bigl(2^N \cdot N\bigr) = N = \log_2(n).
\]
Thus, we learn $N$ bits when we are given a bitstring of length $N$.
Example 11.2 illustrates that $H(\mathcal{S})$ may be viewed as a measure of the number of bits of information obtained when we know the outcome of $\mathcal{S}$. This example also illustrates the fact that entropy measures the minimal number of bits required to represent an event on a computer, and then only the relevant bits pertaining to the uncertainty.
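As a quick numerical check on Example 11.2 (a self-contained sketch under the same assumptions, with $N = 5$ chosen arbitrarily):

    from math import log2

    # Example 11.2: N independent fair coin tosses encoded as length-N
    # bitstrings: 2**N equally likely outcomes, each of probability 2**-N.
    N = 5
    probs = [2 ** -N] * (2 ** N)
    H = -sum(p * log2(p) for p in probs)
    print(H, log2(2 ** N))  # both print 5.0: we learn N bits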
For the next illustration, we go back to page 208, where we talked about Alice and Bob flipping coins by telephone. Let us look at the entropy of that protocol (event).
Example 11.3 In this case $\mathcal{S} = \{0, 1\}$, where 0 is the encoding of tails and 1 is the encoding of heads. Then, as in Example 11.1, $H(\mathcal{S}) = 1$ bit. In other words, Alice is a source of only 1 bit in this coin-flipping protocol, irrespective of the size of the integer $x$. The reason is that Bob knows $x$, so there is no information there for him. There is, for Bob, uncertainty (information) only in its parity, which is Alice's guess; hence her output: 1 bit of entropy.
We have considered only uniform probability distributions thus far. We now look at other scenarios.
Example 11.4 Suppose that we flip three fair coins, where the outcome is the number of tails, so the outcomes are not equally likely. Then $\mathcal{S} = \{0, 1, 2, 3\}$, and we let the probabilities be $p_0 = 1/8$, $p_1 = 3/8$, $p_2 = 3/8$, and $p_3 = 1/8$. Thus, the entropy is
\[
H(\mathcal{S}) = -\tfrac{1}{8}\log_2(1/8) - \tfrac{3}{8}\log_2(3/8) - \tfrac{3}{8}\log_2(3/8) - \tfrac{1}{8}\log_2(1/8) \approx 1.81.
\]
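The arithmetic can be verified with a few lines of Python (our own sketch, not from the text):

    from math import log2

    # Example 11.4: number of tails in three fair coin flips.
    probs = [1/8, 3/8, 3/8, 1/8]
    H = -sum(p * log2(p) for p in probs)
    print(round(H, 2))  # 1.81

Note that $H(\mathcal{S}) \approx 1.81 < \log_2(4) = 2$, consistent with the earlier observation that the maximum $\log_2(n)$ is attained only by the uniform distribution.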