APPENDIX II
Information Theory
Information theory is closely related to cryptography. Cryptanalysts use results obtained
by information theorists to help them crack ciphers, and cryptographers use similar
results when crafting cryptosystems and choosing keys. Information theory provides tools
that allow us to measure the amount of information in a message. Cryptographers attempt
to keep this information to a minimum, while cryptanalysts exploit this tiny amount of
information to help them determine a probable plaintext for a given ciphertext.
AII.1 ENTROPY OF A MESSAGE
If we define the amount of information in a message as the minimum number of bits (includ-
ing fractions of a bit!) needed to encode all possible meanings of the message, we can obtain
a measure of that information. For example, suppose we are looking at the following bit
stream message:
101001101000101010100000101010001000101010011010100001001000101010100100
which we know indicates a month of the year. Regardless of the actual length of the mes-
sage, we could say that the message contains only about 3 or 4 bits of information, since it
only takes that many bits to code up all possible months. (See Table A2.1.)
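Table A2.1 is not reproduced here, but the idea behind it is easy to sketch. The short Python example below builds one possible fixed-length encoding of the twelve months; the particular bit patterns are our own illustration, not necessarily those in the table.

import math

# Twelve equally likely meanings: the months of the year.
months = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

# A whole-bit encoding needs ceil(log2(12)) = 4 bits per month.
bits_needed = math.ceil(math.log2(len(months)))

# Assign each month a fixed-length 4-bit pattern (0000 through 1011).
encoding = {month: format(i, f"0{bits_needed}b") for i, month in enumerate(months)}

for month, code in encoding.items():
    print(f"{month:>9}: {code}")

print(f"{bits_needed} bits suffice to encode all {len(months)} possible months.")

Note that a whole-bit code needs 4 bits per month, slightly more than the roughly 3.6 bits of information the message actually carries; the gap is the "fraction of a bit" mentioned above.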
We define the entropy E(M) of a message M as

E(M) = log₂ n

where n is the number of possible meanings of M, assuming each meaning is equally likely. Thus the entropy of a message M signifying the month is

E(M) = log₂ 12 ≈ 3.58 bits.
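As a quick check of this value, here is a minimal Python sketch (the function name entropy is our own) that evaluates the formula for the month message and, for comparison, for a simple one-bit yes/no message:

import math

def entropy(n: int) -> float:
    """E(M) = log2(n) for a message M with n equally likely meanings."""
    return math.log2(n)

print(entropy(12))   # 3.584962500721156 -- a little over 3.5 bits
print(entropy(2))    # 1.0 -- a message with two equally likely meanings carries one bit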