The first equality in (3.4) defined I(X;Y) as the logarithmic increase in the probability of X that results, on average, from the knowledge of Y, that is, the average quantity of information that the knowledge of Y provides about that of X. The second equality in (3.4), deduced from the first using (3.5), shows that this value is symmetric in X and Y. The quantity of information that Y provides about X is therefore equal to that which X provides about Y, which justifies the name mutual information.
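As an illustration (not part of the original text), the following Python sketch computes I(X;Y) directly from a joint distribution; the binary symmetric channel with error probability 0.1 and an equiprobable input is an assumed example. Transposing the joint matrix swaps the roles of X and Y and leaves the result unchanged, which exhibits the symmetry numerically:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in shannons, computed from a joint probability matrix p_xy[x, y]."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, shape (|X|, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, |Y|)
    mask = p_xy > 0                          # skip zero-probability pairs
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Illustrative joint law: binary symmetric channel, error probability p = 0.1,
# equiprobable input (values assumed for the example).
p = 0.1
p_xy = 0.5 * np.array([[1 - p, p],
                       [p, 1 - p]])
print(mutual_information(p_xy))     # ~0.531 Sh
print(mutual_information(p_xy.T))   # same value: I(X;Y) = I(Y;X)
```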
Mutual information alone is not sufficient to characterize the channel, because it also depends on the entropy of the source, that is, on the quantity of information that the source produces on average per emitted symbol. Entropy, in practice the average number of bits necessary to represent each symbol, is defined by:
$$H(X) = -\sum_{x} \Pr(X = x)\,\log_2 \Pr(X = x)$$
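A minimal Python sketch of this definition (the example distributions are illustrative, not from the text):

```python
import math

def entropy(probs):
    """Entropy in shannons (bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 Sh: equiprobable binary source
print(entropy([0.9, 0.1]))   # ~0.469 Sh: a biased source carries less information
```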
The capacity of a channel is defined as the maximum of the mutual information between its input and output random variables over all possible probability distributions of the input variable. It can be shown that, for a symmetric memoryless channel, this maximum is reached when the input variable of the channel, X, takes equiprobable values (which also makes the entropy of the source maximum). For example, for the binary symmetric channel, the capacity is given by:
$$C = 1 + p\,\log_2(p) + (1 - p)\,\log_2(1 - p) \quad \text{(Sh)} \qquad (3.6)$$
This capacity is maximum for p = 0 (it then equals 1 Sh, like the entropy of the source: the channel is then "transparent") and zero for p = 1/2, which is what we would expect, since there is then total uncertainty.
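A direct evaluation of formula (3.6), with the limiting cases handled explicitly (a sketch; the sample values of p are assumed for illustration):

```python
import math

def bsc_capacity(p):
    """Capacity in shannons of the binary symmetric channel, formula (3.6)."""
    if p in (0.0, 1.0):
        return 1.0  # the p*log2(p) terms vanish in the limit
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.0))   # 1.0 Sh: transparent channel
print(bsc_capacity(0.5))   # 0.0 Sh: total uncertainty
print(bsc_capacity(0.1))   # ~0.531 Sh, matching the mutual-information example
```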
3.1.3 Overview of the fundamental coding theorem
The simplest code that we can imagine is the repetition code, which involves emitting each information bit in the form of several identical symbols. Hard decoding is performed according to the principle of a majority vote, and soft decoding by simple addition of the received samples. If the channel is Gaussian, repetition coding provides no gain in the case of soft decoding. For example, transmitting the same symbol twice, allocating each of them half of the available energy, and then reconstituting the emitted symbol by addition does not give a better result than transmitting a single symbol with all the available energy. As for the majority vote, it can only be envisaged from a triple emission onwards and, in all cases, on a Gaussian channel this procedure degrades the link budget in relation to the uncoded solution, as the sketch below illustrates.
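The following Monte Carlo sketch makes this concrete for BPSK over an AWGN channel; the Eb/N0 operating point and sample size are arbitrary assumptions. At a fixed total energy per information bit, soft combining of two half-energy copies matches the uncoded error rate, while a majority vote over three third-energy copies is worse:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 200_000
eb_n0_db = 4.0                       # assumed Eb/N0 operating point
eb_n0 = 10 ** (eb_n0_db / 10)
sigma = np.sqrt(1 / (2 * eb_n0))     # noise std dev for unit-energy symbols

bits = rng.integers(0, 2, n_bits)
symbols = 1 - 2 * bits               # BPSK mapping: 0 -> +1, 1 -> -1

def ber(decided):
    """Bit error rate of a vector of hard decisions (True = bit 1)."""
    return np.mean(decided != bits)

# Uncoded reference: one symbol carrying the full energy budget.
r = symbols + sigma * rng.standard_normal(n_bits)
print("uncoded:            ", ber(r < 0))

# Repetition x2, soft decoding: each copy carries half the energy,
# the receiver adds the two samples before deciding.
r2 = symbols / np.sqrt(2) + sigma * rng.standard_normal((2, n_bits))
print("x2 repetition, soft:", ber(r2.sum(axis=0) < 0))   # same BER as uncoded

# Repetition x3, hard decoding: each copy carries a third of the energy,
# and a majority vote is taken over the three hard decisions.
r3 = symbols / np.sqrt(3) + sigma * rng.standard_normal((3, n_bits))
votes = (r3 < 0).sum(axis=0)
print("x3 repetition, hard:", ber(votes >= 2))           # worse than uncoded
```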
It should however be noted that repeating messages is a widespread technique, not as an error correction coding procedure, but as a technique for recovering packets of erroneous messages or messages lost during transmission. This technique, called ARQ (Automatic Repeat reQuest), cannot