Illustration 256: The efficiency of convolutional codes. Residual bit-error rate of rate-1/2 convolutional codes in QPSK modulation for increasing constraint length K of the code used (every input bit is "smeared" over K cycles); horizontal axis: signal-to-noise power ratio Eb/N0 in dB.
As can be expected, the efficiency of error correction increases with the length of influence of the encoder
used. This length is determined by the number of storage steps of the shift register: the more of them there are,
the longer an input bit influences its predecessors and successors. The input information is "spread" more
widely and is therefore protected more effectively against individual bit errors.
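The following Python sketch illustrates this spreading for a rate-1/2 convolutional encoder with K = 5 storage steps; the chosen generator polynomials (octal 23 and 35) are an assumption for illustration only and are not necessarily those underlying Illustration 256.

# Sketch: rate-1/2 convolutional encoder, constraint length K = 5.
# Generator polynomials (octal 23, 35) are an illustrative assumption.
K = 5                        # each input bit influences K output pairs
G1 = 0b10011                 # generator polynomial 1
G2 = 0b11101                 # generator polynomial 2

def parity(x):
    """XOR of all bits of an integer."""
    p = 0
    while x:
        p ^= x & 1
        x >>= 1
    return p

def encode(bits):
    """Encode a sequence of 0/1 bits; two coded bits per input bit."""
    state = 0                            # shift register with the last K-1 input bits
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state     # new input bit plus register contents
        out.append(parity(reg & G1))     # first coded bit
        out.append(parity(reg & G2))     # second coded bit
        state = reg >> 1                 # shift: the bit stays influential for K cycles
    return out

# A single "1" in an otherwise zero stream affects the output over K = 5 cycles:
print(encode([0, 0, 1, 0, 0, 0, 0, 0]))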
A constraint length of K = 5 already provides considerable error protection and a much lower bit error rate (BER)
than an uncoded signal. However, the computational effort of the VITERBI decoder increases steeply with
the length of influence.
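This growth can be made plausible with a simple count: the VITERBI decoder keeps one path metric for each of the 2^(K-1) encoder states, so the effort per decoded bit roughly doubles with every additional storage step. A minimal sketch (constant factors are ignored):

# The number of trellis states the VITERBI decoder must track grows as 2**(K-1).
for K in range(3, 10):
    print(f"K = {K}: {2 ** (K - 1):4d} trellis states")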
For an optimal decision strategy the probability of receiving a distorted signal should be
known (see Illustration 255). In practice a choice of 8 decision levels has proved to be
appropriate.
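In a receiver this amounts to quantizing each received sample to 3 bits (8 levels) before the VITERBI decoder. A minimal sketch of such a quantizer for antipodal symbols, assuming nominal values of +1/-1 and an illustrative step size:

# 8-level (3-bit) soft-decision quantizer; the step size 0.25 is an illustrative
# assumption - in a real receiver it is matched to the noise power.
def soft_quantize(sample, step=0.25):
    """Map a noisy received sample to one of 8 levels (0..7)."""
    level = int(sample / step) + 4       # coarse mapping, offset into the range 0..7
    return max(0, min(7, level))         # clip to the 3-bit range

# Strongly negative samples map to 0, strongly positive ones to 7; samples near
# zero fall on the "uncertain" middle levels used by the decoder.
for s in (-1.2, -0.3, 0.05, 0.9):
    print(s, "->", soft_quantize(s))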
Channel capacity
Information theory has a definite date of birth, marked by the famous "encoding
theorems" that Claude SHANNON published in 1948. Before anyone could have foreseen
the present development of digital signal processing, he recognised the fundamental
problems and at the same time provided brilliant, definitive solutions. It took decades for his
work to be properly understood and put to practical use. His contributions are among the
most important scientific insights of the last century, a fact which is, however, recognised
by very few people. From a sociological, technological and scientific point of view his
work overshadows everything else. Read the theses at the end of Chapter 1 again if you
do not believe this. For this reason it is appropriate to look at his fundamental ideas with
reference to the information channel.