Digital Signal Processing Reference
The symbols an information source selects when generating a message are, for the receiver, a source of uncertainty in the sense of unpredictability. This uncertainty is removed only when the receiver recognises the message.
Thus the aim and the result of any communication process is the resolution of uncertainty.
In order to measure the amount of information, a yardstick is needed that grows with the source's freedom of decision. At the same time, the recipient's uncertainty grows as to which message the source will produce and transmit. This yardstick for the quantity of information is called entropy H; it is measured in bits. The information flow R is measured in bits per second.
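The entropy H of a discrete source follows directly from its symbol probabilities. A minimal sketch (the probability values below are illustrative, not from the text):

```python
from math import log2

def entropy(probs):
    """Entropy H in bits per symbol of a discrete source,
    H = -sum(p * log2(p)) over all symbols with p > 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin gives the receiver maximum uncertainty per symbol:
h_fair = entropy([0.5, 0.5])    # 1.0 bit per symbol

# A biased coin is more predictable, so its entropy is lower:
h_biased = entropy([0.9, 0.1])  # about 0.469 bits per symbol
```

Multiplying H by the symbol rate of the source gives the information flow R in bits per second.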
How can the transmission capacity of a distorted or imperfect channel be described? The information source should be selected in such a way that the information flow over a given channel is as great as possible. SHANNON calls this maximum information flow the "channel capacity" C. SHANNON's main thesis on the distorted transmission channel is:
A (discrete) channel has a channel capacity C, and a (discrete)
source with entropy H is connected to it. If H is smaller than
C, there is an encoding system by means of which the information
from the source can be transmitted via the channel with an
arbitrarily small error frequency (bit error probability).
This result was received by other scientists and engineers with astonishment: information can be transmitted via distorted channels with any desired level of reliability!
If the probability of transmission errors increases, that is, if errors occur more frequently, the channel capacity C as defined by SHANNON becomes smaller. The information flow must therefore be reduced until it is smaller than, or at most equal to, the channel capacity C.
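For the simplest distorted channel, the binary symmetric channel, this shrinking of C with the error probability p can be made concrete: its capacity is C = 1 - H_b(p), where H_b is the binary entropy function. A small sketch (the channel model is a standard textbook case, not taken from this text):

```python
from math import log2

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel
    with bit error probability p: C = 1 - H_b(p)."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel carries one full bit per use
    hb = -p * log2(p) - (1 - p) * log2(1 - p)
    return 1.0 - hb

# Capacity shrinks as errors become more frequent:
# p = 0.0  -> C = 1.0
# p = 0.01 -> C is about 0.919
# p = 0.1  -> C is about 0.531
# p = 0.5  -> C = 0 (pure noise, no information gets through)
```

At p = 0.5 the output is statistically independent of the input, so the information flow must drop to zero.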
How can this be achieved? SHANNON "only" proved the existence of this cut-off value; he did not say how to reach it. Modern encoding processes approach this ideal ever more closely, as the development of modem technology to date shows: twenty years ago the data rate achievable via a 3 kHz telephone channel was 2.4 kbit/s; today it is more than 56 kbit/s!
It now seems clear why source encoding (including entropy encoding) as a method of compression should be treated quite separately from (error-protecting) channel encoding. In source encoding, as much redundancy as possible is removed from the signal in order to get as close as possible to the channel capacity C of the undisturbed channel. If the channel is affected by interference, exactly the right amount of redundancy can then be added to get as close as possible to the (smaller) channel capacity of the disturbed channel.
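The simplest way to add redundancy for error protection is a repetition code: each bit is sent several times and the receiver takes a majority vote. This is far from the Shannon limit, but it makes the principle of channel encoding tangible. A minimal sketch (the function names are illustrative):

```python
def encode_repetition(bits, n=3):
    """Channel encoding sketch: add redundancy by repeating each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Majority-vote decoding: up to (n-1)//2 flipped bits per block are corrected."""
    return [1 if 2 * sum(coded[i:i + n]) > n else 0
            for i in range(0, len(coded), n)]

message = [1, 0, 1]
transmitted = encode_repetition(message)  # [1,1,1, 0,0,0, 1,1,1]
transmitted[1] = 0                        # a single transmission error
assert decode_repetition(transmitted) == message
```

The price of this protection is a threefold reduction of the information flow, which is exactly the trade-off described above: the flow is lowered so that it fits under the disturbed channel's smaller capacity C.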