Optimization of cycle size
The optimization of the irregularity profile is asymptotic; we now have to build
a parity-check matrix of finite size. This phase can be performed randomly:
the non-zero entries of the parity-check matrix are drawn at random, respecting
the irregularity profile of the nodes as closely as possible. It is also possible to
build codes by randomly drawing permutations of an elementary matrix, which
are then concatenated. Another way to build LDPC codes is the deterministic
construction of the matrix (finite and projective geometries).
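A minimal sketch of the random construction, assuming for simplicity a regular column degree (the function name and parameters are illustrative, not from the text):

```python
import random

def random_parity_check(n, m, col_degree, rng=random.Random(0)):
    """Draw an m x n binary parity-check matrix with exactly
    col_degree ones per column, placed in randomly chosen rows.
    Row degrees are then only approximately regular, which is why
    the profile is respected 'as far as possible'."""
    H = [[0] * n for _ in range(m)]
    for col in range(n):
        for row in rng.sample(range(m), col_degree):
            H[row][col] = 1
    return H

H = random_parity_check(n=12, m=6, col_degree=3)
```

Such a draw guarantees the variable-node (column) degrees but leaves the check-node (row) degrees to chance; structured constructions trade this freedom for guarantees on the cycles, as discussed below.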
In all cases, we must pay attention to the cycles present in the graph of
the code. The belief propagation decoding algorithm assumes the absence of
cycles, since cycles compromise the independence of the messages entering a
node. In practice, cycles in the graph are inevitable, but if they are long
enough, the independence of the messages remains a good approximation.
Building good LDPC codes therefore requires avoiding the shortest cycles,
those of length 4. Many solutions have been proposed in the literature to build
LDPC codes. For example, Campello et al. [9.9] propose maximizing the length
of the shortest cycle for a given rate. Hu et al. [9.65] suggest building the
graph edge by edge so as to avoid short cycles as much as possible
(Progressive Edge-Growth, or PEG). Zhang et al. [9.67] build LDPC codes
whose shortest cycles have length 12, 16 or 20, but whose variable nodes all
have degree 2. Tian et al. [9.57] exploit the fact that not all short cycles have
the same influence, and eliminate only the most penalizing ones.
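A length-4 cycle in the Tanner graph corresponds to two columns of H having ones in at least two common rows, which gives a simple test. A sketch (matrix examples and function name are illustrative):

```python
from itertools import combinations

def has_length4_cycle(H):
    """True iff the Tanner graph of H contains a length-4 cycle,
    i.e. two columns of H share ones in two (or more) rows."""
    n = len(H[0])
    supports = [frozenset(r for r, row in enumerate(H) if row[c])
                for c in range(n)]
    return any(len(a & b) >= 2 for a, b in combinations(supports, 2))

H_bad = [[1, 1, 0],   # columns 0 and 1 overlap in rows 0 and 1:
         [1, 1, 0],   # a length-4 cycle
         [0, 0, 1]]
H_good = [[1, 1, 0],  # every pair of columns overlaps in at most
          [1, 0, 1],  # one row: no length-4 cycle
          [0, 1, 1]]
# has_length4_cycle(H_bad) -> True, has_length4_cycle(H_good) -> False
```

Construction methods such as PEG generalize this idea, checking at each added edge that no cycle shorter than the current target girth is created.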
Selecting the code by the impulse method
Decoding performance under the belief propagation algorithm is improved
by avoiding short cycles. But it is also important to have "good" error
correcting codes, that is to say, codes with a large minimum distance. The
impulse method was first proposed by Berrou et al. [9.4, 9.5] to evaluate the
minimum Hamming distance of a turbo code. It was then adapted to the case
of LDPC codes by Hu et al. [9.26]. It thus enables us to simply verify that
the minimum distance of the designed code is sufficient to reach the error rate
targeted by the application.
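For intuition about the quantity being estimated, the minimum distance can be computed exhaustively for toy codes. This brute-force sketch is illustrative only (the impulse method exists precisely because such enumeration is infeasible at practical code sizes); the generator matrix below is that of a [7,4] Hamming code:

```python
from itertools import product

def minimum_distance(G):
    """Exhaustive minimum Hamming distance of the binary linear code
    generated by the rows of G. Cost is 2^k, so this is only feasible
    for very small dimensions k."""
    k, n = len(G), len(G[0])
    best = n
    for msg in product([0, 1], repeat=k):
        if not any(msg):
            continue  # skip the all-zero codeword
        cw = [sum(msg[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
        best = min(best, sum(cw))
    return best

G = [[1, 0, 0, 0, 0, 1, 1],   # systematic generator matrix of a
     [0, 1, 0, 0, 1, 0, 1],   # [7,4] Hamming code
     [0, 0, 1, 0, 1, 1, 0],
     [0, 0, 0, 1, 1, 1, 1]]
# minimum_distance(G) -> 3
```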
Selecting the code by simulation
Two codes of the same size and the same rate, built with the same irregularity
profile, free of short cycles and having the same minimum distance, can
nevertheless perform quite differently. These differences can be explained by
two phenomena. First, the sub-optimality of the iterative decoding algorithm
introduces "parasitic" fixed points, which increase the bit error rate with
respect to the theoretical value [9.33]. Second, the number of codewords at
the minimum distance also influences the performance of the code.
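That multiplicity can likewise be computed exhaustively for toy codes; a sketch (illustrative only, infeasible for practical LDPC sizes), reusing the [7,4] Hamming code generator:

```python
from itertools import product
from collections import Counter

def weight_spectrum(G):
    """Hamming-weight distribution of all 2^k codewords of the binary
    code generated by G. The count at the minimum nonzero weight is
    the multiplicity that drives performance at high SNR."""
    k, n = len(G), len(G[0])
    counts = Counter()
    for msg in product([0, 1], repeat=k):
        cw = [sum(msg[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
        counts[sum(cw)] += 1
    return dict(counts)

G = [[1, 0, 0, 0, 0, 1, 1],   # systematic generator matrix of a
     [0, 1, 0, 0, 1, 0, 1],   # [7,4] Hamming code
     [0, 0, 1, 0, 1, 1, 0],
     [0, 0, 0, 1, 1, 1, 1]]
# weight_spectrum(G) -> {0: 1, 3: 7, 4: 7, 7: 1}
```

Two codes with the same minimum distance 3 but multiplicities of, say, 7 versus 70 would show visibly different error floors, which is why simulation remains the final selection criterion.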