This is the asymptotic form of the Gilbert-Varshamov bound, which relates the minimum distance d of the code having the greatest minimum distance possible to its parameters k and n. It is a lower bound but, in its asymptotic form, it is very close to equality. A code whose minimum distance satisfies this bound with equality is considered good with respect to the minimum distance criterion. This shows that a code built with a weight distribution close to that of random coding is also good for this criterion.
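As a numerical illustration, the asymptotic Gilbert-Varshamov bound is usually written R = 1 - H2(delta), linking the code rate R = k/n to the guaranteed relative distance delta = d/n through the binary entropy function H2. The sketch below (the formula is the standard asymptotic form; function names are illustrative) inverts H2 by bisection to find the relative distance guaranteed at a given rate:

```python
import math

def h2(p):
    # Binary entropy function, in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gv_relative_distance(rate, tol=1e-12):
    # Solve H2(delta) = 1 - rate for delta in [0, 1/2] by bisection:
    # the relative distance d/n guaranteed by the asymptotic GV bound.
    lo, hi = 0.0, 0.5
    target = 1.0 - rate
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# A rate-1/2 code family meeting the GV bound has d/n of about 0.11
print(round(gv_relative_distance(0.5), 3))
```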
3.2 Theoretical limits to performance
3.2.1 Binary input and real output channel
Only the case of the binary symmetric channel, with constant error probability p, has been considered so far. Instead of assuming a constant error probability, we can consider that the error probability in fact varies from one symbol to another, because the noise sample that affects the received value varies randomly. Thus, in the presence of Gaussian noise, the value leaving the optimal demodulator is a Gaussian random variable whose sign represents the optimal decision. We will consider the channel that has this real random variable, which we denote a, as its output value. It can be shown that this value is linked to the optimal decision x, that is, to the best hypothesis concerning the emitted bit x, and to the "instantaneous" error probability p_a, according to the relation:
$$a = (-1)^x \ln\frac{1 - p_a}{p_a} \qquad (3.10)$$
which means, assuming p_a lower than 1/2:
$$p_a = \frac{1}{\exp\left((-1)^x a\right) + 1} = \frac{1}{\exp(|a|) + 1} \qquad (3.11)$$
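Relations (3.10) and (3.11) are inverses of each other, which can be checked numerically. A minimal sketch in Python (function names are illustrative):

```python
import math

def llr_from_error_prob(p_a, x):
    # Relation (3.10): a = (-1)^x * ln((1 - p_a) / p_a),
    # where x is the decision and p_a the instantaneous error probability.
    return (-1) ** x * math.log((1 - p_a) / p_a)

def error_prob_from_llr(a):
    # Relation (3.11): p_a = 1 / (exp(|a|) + 1), valid for p_a < 1/2.
    return 1.0 / (math.exp(abs(a)) + 1.0)

# Round trip: recovering p_a from a gives back the starting value.
for x in (0, 1):
    for p_a in (0.01, 0.1, 0.3, 0.49):
        a = llr_from_error_prob(p_a, x)
        assert abs(error_prob_from_llr(a) - p_a) < 1e-12
```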
We mean by instantaneous error probability the error probability p_a that affects the received symbol when the real value measured at the output of the channel is a. The inequality p_a < 1/2 makes ln((1 - p_a)/p_a) positive, so that the best decision is x = 0 when a is positive and x = 1 when a is negative. In addition, the absolute value |a| = ln((1 - p_a)/p_a) is a decreasing function of the error probability of the decision, and it therefore measures its reliability. It is null for p_a = 1/2 and tends towards infinity when the error probability p_a tends towards 0 (the decision then becomes absolutely reliable). The real quantity that (3.10) defines is called the relative value, or more often the log likelihood ratio (LLR), of the corresponding binary symbol.
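The behaviour of |a| as a reliability measure can also be seen numerically: it is null at p_a = 1/2 and grows without bound as p_a tends towards 0. A short sketch (the function name is illustrative, the formula is the LLR magnitude from (3.10)):

```python
import math

def reliability(p_a):
    # |a| = ln((1 - p_a) / p_a): zero at p_a = 1/2 (useless decision),
    # arbitrarily large as p_a tends to 0 (absolutely reliable decision).
    return math.log((1 - p_a) / p_a)

for p_a in (0.5, 0.25, 0.1, 0.01, 0.001):
    print(f"p_a = {p_a:6.3f}  ->  |a| = {reliability(p_a):7.3f}")
```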
The capacity of the channel thus defined can be calculated as the maximum
with respect to X of the mutual information I ( X ; Y ) , defined by generalizing
(3.4) to real Y = a . This generalization is possible but the expression of the