extended Hamming code according to law (1.4) is no longer optimal. The maximum likelihood decoding law to implement in order to exploit these weighted values depends on the type of noise. An important case in practice is additive white Gaussian noise (AWGN).
u is a Gaussian random variable with mean μ and variance σ² when its probability density p(u) can be expressed in the form:

p(u) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left( -\frac{(u - \mu)^2}{2\sigma^2} \right)    (1.8)
The AWGN is a perturbation which, after matched filtering and periodic sampling (see Chapter 2), produces independent samples whose amplitude follows probability density law (1.8), with zero mean and variance:
\sigma^2 = \frac{N_0}{2}    (1.9)

where N_0 is the noise power spectral density.
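As a quick numerical check (a Python sketch; `n0` and the sample count are illustrative values, not taken from the book), samples with density (1.8) and variance (1.9) can be drawn and their empirical statistics verified:

```python
import math
import random

def awgn_samples(n, n0, rng):
    """Draw n independent zero-mean Gaussian noise samples with
    variance N0/2, as in (1.9)."""
    sigma = math.sqrt(n0 / 2.0)
    return [rng.gauss(0.0, sigma) for _ in range(n)]

rng = random.Random(0)   # fixed seed for reproducibility
n0 = 2.0                 # illustrative noise power spectral density
samples = awgn_samples(100_000, n0, rng)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# Empirically, mean ≈ 0 and var ≈ N0/2 = 1.0
```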
A transmission channel on which the only alteration of the signal comes from
an AWGN is called a Gaussian channel . At the output of such a channel, the
ML decoding is based on the exhaustive search for the codeword that is at the
smallest Euclidean distance from the received word. Denoting by X_j and Y_j the received values corresponding to the transmitted symbols x_j and y_j respectively, the soft-input decoder of the extended Hamming code therefore chooses:
\hat{c} = c \in \{c\} \text{ such that } \sum_{j=0}^{3} (x_j - X_j)^2 + \sum_{j=0}^{3} (y_j - Y_j)^2 \text{ is minimum}    (1.10)
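As an illustration, the exhaustive search of law (1.10) fits in a few lines of Python. The parity equations and the 0 → +1, 1 → −1 mapping below are assumptions made for the sketch, not necessarily the exact construction of law (1.4):

```python
from itertools import product

def extended_hamming_codebook():
    """All 16 codewords of an (8,4) extended Hamming code.

    The parity equations are one standard choice; the book's own
    construction (law (1.4)) may order the bits differently.
    """
    words = []
    for d1, d2, d3, d4 in product((0, 1), repeat=4):
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        bits = [d1, d2, d3, d4, p1, p2, p3]
        bits.append(sum(bits) % 2)   # overall parity -> minimum distance 4
        words.append(bits)
    return words

def ml_decode(received, codebook):
    """Soft-input ML decoding in the form (1.10): return the codeword
    whose antipodal image (bit 0 -> +1, bit 1 -> -1) lies at the
    smallest Euclidean distance from the received real-valued word."""
    def dist2(c):
        return sum((r - (1 - 2 * b)) ** 2 for r, b in zip(received, c))
    return min(codebook, key=dist2)

codebook = extended_hamming_codebook()
sent = codebook[9]                   # some codeword
noise = [0.4, -0.8, 0.3, -0.2, 0.6, -0.5, 0.1, 0.7]
noisy = [(1 - 2 * b) + e for b, e in zip(sent, noise)]
decoded = ml_decode(noisy, codebook)  # recovers `sent` here
```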
Since the transmitted values are all such that x_j^2 = 1 or y_j^2 = 1, and all the Euclidean distances contain the terms X_j^2 and Y_j^2, the previous law can be simplified as:
\hat{c} = c \in \{c\} \text{ such that } -\sum_{j=0}^{3} 2 x_j X_j - \sum_{j=0}^{3} 2 y_j Y_j \text{ is minimum}
or as:
\hat{c} = c \in \{c\} \text{ such that } \sum_{j=0}^{3} x_j X_j + \sum_{j=0}^{3} y_j Y_j \text{ is maximum}    (1.11)
Minimizing the Euclidean distance between two codewords c and c' therefore means maximizing the scalar product

\langle x, X \rangle + \langle y, Y \rangle = \sum_{j=0}^{3} x_j X_j + \sum_{j=0}^{3} y_j Y_j

where x, X, y and Y represent the transmitted and received sequences of the systematic and redundant parts.
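The equivalence between the distance form (1.10) and the correlation form (1.11) can be checked numerically. The sketch below uses a placeholder codebook of all ±1 words of length 4 (an assumption for illustration; any antipodal codebook of equal-energy words behaves the same way):

```python
import random
from itertools import product

# Placeholder codebook: all 16 antipodal words of length 4.
codebook = [[1 - 2 * b for b in w] for w in product((0, 1), repeat=4)]

def nearest_euclidean(rx):
    """Decision rule (1.10): minimize the squared Euclidean distance."""
    return min(codebook, key=lambda c: sum((r - x) ** 2 for r, x in zip(rx, c)))

def best_correlation(rx):
    """Decision rule (1.11): maximize the scalar product <rx, c>."""
    return max(codebook, key=lambda c: sum(r * x for r, x in zip(rx, c)))

# Since x_j^2 = 1 for every codeword, expanding the squared distance
# shows both rules pick the same word for any received vector.
rng = random.Random(1)
rx = [rng.gauss(0.0, 1.0) for _ in range(4)]
same = nearest_euclidean(rx) == best_correlation(rx)   # True
```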