It is clear that the amount of information is
$I = -\log_2 2^{-3} = 3$    (8.2)
Next, let's consider another case, as shown in Fig. 8.1b. There, the probability of B is 4/8, i.e., 1/2, and it is obvious that the minimum number of questions to be asked is 1. Here also, the probability is given by
$P = \frac{1}{2} = 2^{-1}$    (8.3)
and it is clear that the amount of information is
$I = -\log_2 2^{-1} = 1$    (8.4)
Combining Eqs. ( 8.2 ) and ( 8.4 ), we get the measure of information as
$I = -\log_2(P) = \log_2 \frac{1}{P}$    (8.5)
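As a quick numerical check of Eq. (8.5), the sketch below (a hypothetical Python snippet, not part of the original text; the function name self_information is illustrative) computes $I = -\log_2(P)$ for the two probabilities used above, 1/8 and 1/2, and reproduces the 3 bits and 1 bit obtained in Eqs. (8.2) and (8.4).

```python
import math

def self_information(p):
    """Amount of information (in bits) of an event with probability p,
    following Eq. (8.5): I = -log2(P) = log2(1/P)."""
    return -math.log2(p)

# The two cases discussed above:
print(self_information(1 / 8))  # 3.0 bits, as in Eq. (8.2)
print(self_information(1 / 2))  # 1.0 bit,  as in Eq. (8.4)
```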
The concept of information and its measure, in accordance with the views of psychologists, matches perfectly with the conjecture of Shannon in his revolutionary paper 'A Mathematical Theory of Communication', published in the Bell System Technical Journal in 1948 [1]. There he also defined information logarithmically, and the average information as $\sum_i p_i I_i$, i.e., $-\sum_i p_i \log_2 p_i$.
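To illustrate this averaging step, the following sketch (again a hypothetical Python snippet, assuming base-2 logarithms so the result is expressed in bits) evaluates the average information $-\sum_i p_i \log_2 p_i$ for a given probability distribution.

```python
import math

def average_information(probabilities):
    """Average information in bits: H = -sum_i p_i * log2(p_i).
    Terms with p_i = 0 are skipped, following the convention 0*log(0) = 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely outcomes (as in Fig. 8.1b) give 1 bit on average:
print(average_information([0.5, 0.5]))       # 1.0
# A less uniform source carries less average information:
print(average_information([0.125, 0.875]))   # about 0.544
```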
Information theory deals with three basic concepts:
(i) The measure of source information: Evaluation of the rate at which the source
generates information.
(ii) The information capacity of the channel: It determines the maximum rate at which
reliable information transmission is possible over a given channel with an
arbitrarily small error.
(iii) Coding: A scheme for efficient utilization of the information capacity of the
channel.
The combination of the above three concepts gives rise to a number of theorems and forms the entire discipline known as information theory. In summary,
If the rate of information from a source does not exceed the capacity of a given communication channel, then there exists a coding technique such that the information can be transmitted over the channel with an arbitrarily small error, despite the presence of noise.
This implies that, with the help of a suitable coding scheme, it is possible to transmit signals from the source over a noisy channel almost error-free.
 