The number of 1s contained in a codeword that is not "all zero" is called the
Hamming weight and is denoted w_H. We can distinguish the weight relating
to the systematic part (w_H,s) and the weight relating to the redundant part
(w_H,r). We note, in Table 1.1, that w_H is at least equal to 4. Because of
linearity, this also means that the number of bits that differ between any two
codewords is at least equal to 4. The number of bits that differ, when we
compare two binary words, is called the Hamming distance. The smallest of all
the distances between all the codewords, considered two by two, is called the
minimum Hamming distance (MHD) and denoted d_min. Linearity means that
it is also the smallest of the Hamming weights in the list of codewords, excluding
the "all zero" word. d_min is an essential parameter for characterizing a particular
code, since the correction capability of the corresponding decoder is directly
linked to it.
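These two equivalent characterizations of d_min can be checked numerically. The sketch below assumes a typical set of parity equations for a systematic (8,4) extended Hamming code; the actual equations of Table 1.1 may differ, but any such code gives the same weights.

```python
from itertools import product

# Hypothetical parity equations for a systematic (8,4) extended Hamming
# code; Table 1.1 in the text may use a different but equivalent set.
def encode(d):
    """Systematic encoding: codeword = (d0, d1, d2, d3, r0, r1, r2, r3)."""
    d0, d1, d2, d3 = d
    r = ((d1 + d2 + d3) % 2,
         (d0 + d2 + d3) % 2,
         (d0 + d1 + d3) % 2,
         (d0 + d1 + d2) % 2)
    return d + r

codewords = [encode(d) for d in product((0, 1), repeat=4)]

def w_H(c):
    """Hamming weight: number of 1s in the word."""
    return sum(c)

def d_H(a, b):
    """Hamming distance: number of positions where a and b differ."""
    return sum(x != y for x, y in zip(a, b))

# Minimum distance over all distinct pairs of codewords, and minimum
# nonzero weight: linearity makes these two quantities equal.
d_min = min(d_H(a, b) for a in codewords for b in codewords if a != b)
min_weight = min(w_H(c) for c in codewords if any(c))
print(d_min, min_weight)  # both equal 4
```

Because the code is linear, the difference of two codewords is itself a codeword, which is why the pairwise minimum distance coincides with the minimum nonzero weight.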
We now write a decoding law for the code of Figure 1.1:
"After receiving the word c' = (d_0', d_1', d_2', d_3', r_0', r_1', r_2', r_3') transmitted
by the encoder and possibly altered during transmission, the decoder chooses
the closest codeword c in the sense of the Hamming distance."
The job of the decoder is therefore to run through Table 1.1 and, for each of the
sixteen possible codewords, to count the number of bits that differ from c'. The
codeword c selected is the one that differs least from c'. When several solutions
are possible, the decoder selects one at random. Mathematically, this is written:
ĉ = c ∈ {c} such that  Σ_{j=0}^{3} (d_j ⊕ d_j') + Σ_{j=0}^{3} (r_j ⊕ r_j')  is minimum    (1.4)
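This exhaustive search can be sketched directly. The example below implements law (1.4) over the sixteen codewords, again assuming a hypothetical set of parity equations for the (8,4) code (the actual Table 1.1 equations may differ); it also illustrates the correction behaviour discussed further on: one error is corrected unambiguously, while two errors leave four equidistant candidates.

```python
from itertools import product

# Hypothetical parity equations for the systematic (8,4) extended
# Hamming code; Table 1.1 may use a different but equivalent set.
def encode(d):
    d0, d1, d2, d3 = d
    return d + ((d1 + d2 + d3) % 2, (d0 + d2 + d3) % 2,
                (d0 + d1 + d3) % 2, (d0 + d1 + d2) % 2)

CODEWORDS = [encode(d) for d in product((0, 1), repeat=4)]

def ml_decode(c_prime):
    """Exhaustive ML decoding per law (1.4): return every codeword at
    minimum Hamming distance from c_prime. One element means the
    decision is unambiguous; several mean the decoder faces a dilemma."""
    dist = lambda c: sum(x != y for x, y in zip(c, c_prime))
    best = min(dist(c) for c in CODEWORDS)
    return [c for c in CODEWORDS if dist(c) == best]

one_error = (1, 0, 0, 0, 0, 0, 0, 0)   # "all zero" sent, one bit inverted
two_errors = (1, 1, 0, 0, 0, 0, 0, 0)  # two bits inverted: d0' = d1' = 1

print(ml_decode(one_error))        # only the "all zero" word: corrected
print(len(ml_decode(two_errors)))  # four equidistant codewords: dilemma
```

The sum of the two XOR terms in (1.4) is exactly the Hamming distance between c and c', computed separately over the systematic and redundant parts.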
A decoder capable of implementing such a process is called a maximum likelihood
(ML) decoder, as all the cases, here the sixteen possible codewords, are run
through in the search for a solution and no more efficient decoding process
exists. With codes other than the very simple Hamming code, the message to
be encoded contains far more than four bits (in practice, from a few tens
to a few tens of thousands of bits) and ML decoding is impossible to execute, as
the number of codewords is too high.
Assume that the Hamming encoder transmits the "all zero" word that we
have chosen as the reference, and that some of the 0s have been inverted on the
transmission channel before reception. How many errors can the decoder
correct, based on law (1.4)? If c' contains a single 1 (a single error), the "all
zero" word is closer to c' than any other codeword, since every other codeword
possesses at least four 1s. Correction is thus possible. If c' contains two 1s, for
example in the first two places (d_0' = d_1' = 1), the decoder is faced with a
dilemma: four codewords are at a distance of 2 from the received word. It must
therefore choose c at random from among the four possible solutions, with three
chances out of four of being wrong. In this situation, we can also arrange that
the decoder does not give a solution but merely indicates its dilemma: it then
plays the role of