11.2 Entropy
Heretics are the only bitter remedy against the entropy of human thought.
— Yevgeny Zamyatin (1884-1937), Russian writer; see [298, introduction]
Information Theory is concerned with sending messages via electronic signals in the most efficient and error-free manner. Shannon defined information to mean a measure of one's freedom of choice when one selects a message. This “measure” will gain mathematical precision below. The idea is that information refers to the degree of uncertainty that exists in the situation at hand. Therefore, in this (Information Theory) sense of the word, any situation that is totally predictable (namely, whose outcome is certain) has very little information (perhaps none). Thus, redundancy adds little, if any, information. Redundancy, such as the repeating of a message, helps to eliminate noise in a communications system. Think of “noise” as anything within the communications system that is contrary to the predictability of the outcome of that system.11.1 The term entropy is the degree of randomness (or uncertainty) in a given situation, measured in bits, to which we will give a mathematical flavour below. In other words, entropy is a measure of the amount of information in a given message source. Moreover, in Information Theory, “efficiency” refers to the bits of data per second that can be sent and received, and “accuracy” (error-freeness) is the extent to which transmitted data can be understood (meaning clarity of reception, wherein the message may not have “meaning”). When we put the above into a cryptological situation, where intended plaintext messages do have meaning, encryption may be seen as noise added to the cryptosystem. The entropy is the measure of the uncertainty about a message before it leaves the message source. Now we look at all of this from a mathematical viewpoint.
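As a brief preview of the quantity made precise below, the standard Shannon entropy of a source with probabilities p₁, ..., pₙ is H = −Σ pᵢ log₂ pᵢ, measured in bits. A minimal Python sketch (an illustration, not part of the text) shows how a perfectly predictable source carries essentially no information:

    from math import log2

    def entropy(probs):
        # Entropy in bits; terms with probability 0 contribute nothing.
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
    print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
    print(entropy([1.0]))        # certain outcome: 0.0 bits, no information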
Properties of Entropy
Shannon required that entropy must satisfy the following properties.
1. H must be a continuous function of the variables p₁, p₂, ..., pₙ, the probability distribution. In this way, a small change in the probability distribution should not severely alter the uncertainty.
2. When all messages are equally likely, that is, when p(mⱼ) = 1/n for all j = 1, 2, ..., n, H should be an increasing function of n. In other words,

H(1/n, ..., 1/n) ≤ H(1/(n+1), ..., 1/(n+1)) for all n ≥ 1,
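As a hypothetical numerical check of property 2, the uniform distribution on n messages has entropy log₂ n, which indeed grows with n:

    from math import log2

    def uniform_entropy(n):
        # H(1/n, ..., 1/n) computed term by term.
        return -sum((1 / n) * log2(1 / n) for _ in range(n))

    for n in (2, 4, 8, 16):
        print(n, uniform_entropy(n))   # 1.0, 2.0, 3.0, 4.0 bits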
11.1 In 1948, Shannon was able to precisely determine the maximum data rate achievable over
any transmission channel involving noise. Today this is known as Shannon's theorem , which
says that
K = B · log₂(1 + S/N),
where K is the effective limit on the channel's capacity measured in bits per second, B is
the bandwidth of the hardware, S is the average signal strength, and N is the average noise
strength. S/N is called the signal-to-noise ratio . Shannon's theorem places a fundamental
limit on the number of bits per second that can be transmitted over a channel. Thus, no
amount of engineering innovation will overcome this basic physical law.
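For illustration only, here is a short Python sketch applying the capacity formula above; the bandwidth and signal-to-noise figures are assumed values chosen for the example, not figures from the text:

    from math import log2

    def channel_capacity(bandwidth_hz, signal_to_noise):
        # K = B * log2(1 + S/N), in bits per second.
        return bandwidth_hz * log2(1 + signal_to_noise)

    # e.g., a voice-grade channel of roughly 3000 Hz with S/N = 1000
    print(channel_capacity(3000, 1000))   # about 29,900 bits per second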