Probability of erroneous decoding of a codeword
For a linear block code C(n, k) of error correction capability t, the codeword
transmitted will be wrongly decoded if there are t + j errors,
j = 1, 2, ..., n − t, in the received word r. For a binary symmetric channel of
error probability p, the probability P_{e,word} of performing an erroneous
decoding of the transmitted codeword is upper bounded by:

$$P_{e,\mathrm{word}} < \sum_{j=t+1}^{n} \binom{n}{j} p^{j} (1-p)^{n-j} \qquad (4.24)$$
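As a numerical illustration (a sketch, not part of the text), the bound (4.24) can be evaluated directly; the (15,7) BCH code discussed in this section, which corrects t = 2 errors, is used here as an assumed example:

```python
from math import comb

def word_error_bound(n, t, p):
    """Upper bound (4.24) on the word error probability after algebraic
    decoding of a t-error-correcting (n, k) block code over a binary
    symmetric channel of error probability p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(t + 1, n + 1))

# (15,7) BCH code, t = 2, channel error probability p = 0.01
bound = word_error_bound(n=15, t=2, p=0.01)
```

The sum runs over the numbers of channel errors (t + 1 up to n) that exceed the correction capability, each weighted by its binomial probability.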
We can also determine the binary error probability P_{e,bit} on the information
data after decoding. In the presence of erroneous decoding, the maximum
a posteriori likelihood decoder adds at most t errors by choosing the codeword
at minimum distance from the received word. The error probability is therefore
bounded by:
$$P_{e,\mathrm{bit}} < \frac{1}{n} \sum_{j=t+1}^{n} (j+t) \binom{n}{j} p^{j} (1-p)^{n-j} \qquad (4.25)$$
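The bound (4.25) can likewise be evaluated numerically; this sketch (not from the text) assumes the same (15,7) BCH code with t = 2:

```python
from math import comb

def bit_error_bound(n, t, p):
    """Upper bound (4.25) on the bit error probability after decoding:
    an erroneous decoding triggered by j channel errors produces at most
    j + t bit errors among the n positions."""
    return sum((j + t) / n * comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(t + 1, n + 1))

# (15,7) BCH code, t = 2, channel error probability p = 0.01
bound = bit_error_bound(n=15, t=2, p=0.01)
```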
If the transmission is performed with binary phase modulation (2-PSK, 4-PSK),
probability p is equal to:

$$p = \frac{1}{2}\,\mathrm{erfc}\left(\sqrt{\frac{R E_b}{N_0}}\right)$$
where R is the coding rate, E_b the energy received per transmitted information
bit and N_0 the unilateral power spectral density of the noise. Figure 4.5 shows
the binary error probability and word error probability after algebraic decoding
for the (15,7) BCH code. The modulation is 4-PSK and the channel is Gaussian.
The upper bounds expressed by (4.24) and (4.25) respectively are also plotted.
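As a small sketch (assuming Python's standard math.erfc; the function name is hypothetical), the channel error probability p can be computed from the coding rate and Eb/N0 in dB:

```python
from math import erfc, sqrt

def channel_error_probability(R, ebn0_db):
    """Binary error probability p for 2-PSK/4-PSK over a Gaussian channel:
    p = (1/2) * erfc(sqrt(R * Eb/N0)), with Eb/N0 given in dB."""
    ebn0 = 10 ** (ebn0_db / 10)   # convert dB to linear scale
    return 0.5 * erfc(sqrt(R * ebn0))

# (15,7) BCH code: coding rate R = 7/15
p = channel_error_probability(R=7/15, ebn0_db=5.0)
```

Feeding this p into the bounds (4.24) and (4.25) reproduces curves of the kind plotted in Figure 4.5.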
Soft decoding
Considering a channel with additive white Gaussian noise and binary phase
modulation transmission (2-PSK or 4-PSK), the components r_j,
j = 0, 1, ..., n − 1, of the received word r have the form:
$$r_j = \sqrt{E_s}\, c'_j + b_j, \qquad c'_j = 2 c_j - 1$$
where c_j = 0, 1 is the symbol in position j of codeword c, c'_j is the binary
symbol associated with c_j, E_s is the energy received per transmitted symbol
and b_j is white Gaussian noise, with zero mean and variance equal to σ_b².
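A minimal sketch of this channel model (the name awgn_receive and the parameter values are hypothetical, not from the text):

```python
import random
from math import sqrt

def awgn_receive(codeword, es=1.0, sigma_b=0.5, rng=random):
    """Soft received word r_j = sqrt(Es) * c'_j + b_j, where
    c'_j = 2*c_j - 1 maps the bit c_j in {0, 1} to an antipodal symbol
    and b_j is zero-mean Gaussian noise of standard deviation sigma_b."""
    return [sqrt(es) * (2 * c - 1) + rng.gauss(0.0, sigma_b)
            for c in codeword]

# Example: transmit the bits 1, 0, 1 at symbol energy Es = 1
r = awgn_receive([1, 0, 1])
```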
Maximum a posteriori likelihood decoding
Decoding using the maximum a posteriori likelihood criterion means search-
ing for the codeword ĉ such that:

$$\hat{c} = c \;\Leftrightarrow\; \Pr\{c \mid r\} > \Pr\{c' \mid r\}, \quad \forall\, c' \neq c,\; c' \in C(n, k)$$
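For equiprobable codewords on a Gaussian channel, maximizing Pr{c | r} amounts to choosing the codeword whose antipodal image sqrt(Es)·(2c_j − 1) is closest to r in Euclidean distance. A brute-force sketch of this search (the (3,1) repetition codebook is a toy example, not from the text):

```python
from math import sqrt

def ml_decode(r, codebook, es=1.0):
    """Exhaustive maximum-likelihood decoding over an AWGN channel:
    return the codeword whose modulated image sqrt(Es)*(2*c_j - 1)
    minimizes the squared Euclidean distance to the received word r."""
    def dist2(c):
        return sum((rj - sqrt(es) * (2 * cj - 1)) ** 2
                   for rj, cj in zip(r, c))
    return min(codebook, key=dist2)

# Toy example: the (3,1) repetition code
codebook = [(0, 0, 0), (1, 1, 1)]
decoded = ml_decode([0.9, -0.2, 0.4], codebook)  # → (1, 1, 1)
```

The exhaustive search is exponential in k, which is why practical soft decoders rely on the code's structure rather than enumerating C(n, k).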