drawback of this solution remains: only one-way communication between the
equalizer and the decoder.
Is there, then, a strategy that offers the best of both worlds, reconciling
the good performance of the optimal joint receiver with the implementation
simplicity of the sub-optimal disjoint receiver?
Today, it is possible to reply in the affirmative, thanks to what we have called
"turbo equalization".
11.1.4 Principle of turbo equalization
The concept of turbo equalization first saw the light of day in the laboratories
of ENST Bretagne at the beginning of the 1990s, spurred by the spectacular
results obtained with turbo codes. It was the outcome of a very simple
observation: the transmission scheme in Figure 11.8 can be seen as the serial
concatenation of two codes (Chapter 6), separated by an interleaver, the second
code being formed by cascading the mapping operation with the channel². Seen
from this angle, it would then seem natural to apply a decoding strategy of the
"turbo" type at reception, that is, a reciprocal, iterative exchange of probabilis-
tic information (extrinsic information) between the equalizer and the decoder.
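To illustrate this reciprocal exchange, here is a deliberately simplified sketch, not taken from the original scheme: a two-tap ISI channel (taps chosen arbitrarily), a rate-1/2 repetition code standing in for the convolutional code, and a soft interference canceller standing in for the SISO (BCJR) equalizer. The function names, tap values, and noise level are all illustrative assumptions; only the structure of the loop, in which each block passes extrinsic log-likelihood ratios (LLRs) to the other through the (de)interleaver, reflects the turbo principle described above.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5])   # assumed 2-tap ISI channel (illustrative values)
sigma2 = 0.3               # assumed noise variance

# Toy "code": rate-1/2 repetition, standing in for the convolutional code
bits = rng.integers(0, 2, 64)
coded = np.repeat(bits, 2)
perm = rng.permutation(coded.size)        # random interleaver
x = 1 - 2 * coded[perm].astype(float)     # BPSK mapping: 0 -> +1, 1 -> -1
# ISI channel plus AWGN: y[i] = h0*x[i] + h1*x[i-1] + noise
y = np.convolve(x, h)[: x.size] + rng.normal(0, np.sqrt(sigma2), x.size)

def equalize(y, prior_llr):
    """Soft interference canceller (stand-in for a BCJR equalizer):
    subtract the expected contribution of the previous symbol, then
    form channel LLRs. The prior on symbol i itself is not used, so
    the output is extrinsic with respect to that symbol."""
    xbar = np.tanh(prior_llr / 2)                     # soft symbol estimates
    interf = np.concatenate(([0.0], h[1] * xbar[:-1]))
    return 2 * h[0] * (y - interf) / sigma2

def decode(llr_coded):
    """Repetition decoder: the total info-bit LLR is the sum of the two
    copies; the extrinsic LLR of each copy is the LLR of the *other* copy."""
    pairs = llr_coded.reshape(-1, 2)
    ext = pairs[:, ::-1].reshape(-1)
    return ext, pairs.sum(axis=1)

prior = np.zeros(x.size)
for _ in range(5):                        # turbo iterations
    ch_llr = equalize(y, prior)
    deint = np.empty_like(ch_llr)
    deint[perm] = ch_llr                  # deinterleave equalizer output
    ext, info_llr = decode(deint)
    prior = ext[perm]                     # interleave, feed back as prior

decoded = (info_llr < 0).astype(int)
print("bit errors:", int(np.sum(decoded != bits)))
```

At the first pass the priors are zero, so no interference is cancelled; as the iterations proceed, the decoder's extrinsic LLRs harden, the cancellation improves, and the residual ISI shrinks, which is the convergence behaviour discussed below.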
The first turbo equalization scheme was proposed in 1995 by Douillard et al.
[11.12]. This scheme implements a weighted-input, weighted-output (Soft Input
Soft Output, or SISO) Viterbi equalizer based on the Soft Output Viterbi
Algorithm (SOVA). The principle was then taken up in 1997 by Bauch et al.,
who replaced the SOVA equalizer with a SISO equalizer that is optimal in the
sense of the MAP criterion, using the algorithm developed by Bahl et al. (the
BCJR algorithm [11.7]).
The simulation results quickly showed that the turbo equalizer was capable,
under certain conditions, of totally removing ISI. In retrospect, this excellent
performance can be explained by the fact that this transmission scheme brings
together two key ingredients that give the turbo principle its force:
1. The implementation of iterative decoding at reception, introducing an
exchange of probabilistic information between the processing operations;
we now know that, when the signal-to-noise ratio exceeds a certain
"convergence threshold", this exchange converges towards the performance
of the optimal joint receiver after a certain number of iterations.
2. The presence of an interleaver at transmission, whose role here is mainly
to break up the error packets at the output of the equalizer (to avoid the
phenomenon of error propagation) and to decorrelate as far as
² Note that, strictly speaking, transmission over a selective channel does not constitute a
coding operation in itself, despite its convolutional character, since it provides no gain;
indeed, it only degrades performance. Nevertheless, the analogy makes sense from the
iterative decoding point of view.