Since:

$$2^{-l(c_i)} = 2^{-\lceil -\log_2 p_X(i) \rceil} \le 2^{\log_2 p_X(i)} = p_X(i),$$

summing over the alphabet gives

$$\sum_{i=1}^{L_X} 2^{-l(c_i)} \le \sum_{i=1}^{L_X} p_X(i) = 1,$$
from this we deduce that an instantaneous code exists which satisfies the prefix
condition.
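As a quick numerical check (a sketch, not from the book: the probabilities are those of Table 4.2 below, used purely for concreteness), the Shannon lengths $l(c_i) = \lceil -\log_2 p_X(i) \rceil$ indeed satisfy the Kraft inequality:

```python
import math

# Probabilities of Table 4.2, used here only for concreteness;
# any probability mass function would do.
p = [0.5, 0.15, 0.17, 0.08, 0.06, 0.04]

# Shannon lengths: l(c_i) = ceil(-log2 p_X(i))
lengths = [math.ceil(-math.log2(pi)) for pi in p]
print(lengths)                                   # [1, 3, 3, 4, 5, 5]

# Kraft sum: at most 1, so a prefix code with these lengths exists
print(sum(2.0 ** -l for l in lengths))           # 0.875 <= 1

# Mean word length of the Shannon code
print(sum(pi * l for pi, l in zip(p, lengths)))  # 2.28 bits
```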
4.2.4.2. Huffman algorithm
The algorithm proposed by Huffman in 1951 involves the progressive construction
of a binary tree starting with the terminal nodes:
- start from the two lists $\{x_1, \dots, x_{L_X}\}$ and $\{p_X(1), \dots, p_X(L_X)\}$;
- select the two least probable symbols, create two tree branches, and label them
with the binary symbols 0 and 1;
- the two lists are updated by replacing the two symbols just selected with a new symbol, whose probability is the sum of the probabilities of the two selected symbols;
- the two preceding steps are repeated until only one symbol remains in the list; this symbol is the root of the tree (a minimal implementation sketch follows this list).
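As an illustration, here is one possible Python sketch of this construction (not from the book; the name `huffman_code` and the heap-based bookkeeping are implementation choices). It repeatedly merges the two least probable entries and prepends the branch label 0 or 1 to every codeword in the merged subtrees:

```python
import heapq
import itertools

def huffman_code(symbols, probs):
    """Huffman construction: repeatedly merge the two least probable
    entries, labelling the two new branches 0 and 1."""
    counter = itertools.count()  # tie-breaker so dicts are never compared
    heap = [(p, next(counter), {s: ""}) for s, p in zip(symbols, probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # least probable entry
        p1, _, c1 = heapq.heappop(heap)   # second least probable entry
        # Each merge prepends the branch label to every codeword below it.
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]  # the single remaining entry is the root

# Probabilities of Table 4.2 (Example 1 below)
symbols = ["x1", "x2", "x3", "x4", "x5", "x6"]
probs = [0.5, 0.15, 0.17, 0.08, 0.06, 0.04]
code = huffman_code(symbols, probs)
mean_length = sum(p * len(code[s]) for s, p in zip(symbols, probs))
print(code)
print(mean_length)  # 2.1 bits, as in Example 1
```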
We can show that this algorithm is optimal: any other uniquely decodable code has a mean word length greater than or equal to that of the code produced by this algorithm.
4.2.4.3. Example 1
Take the example of a source which can only take six different values and assume
that their probabilities are known. These are given in Table 4.2. The entropy of this
source is 2.06 bits. The Shannon code has a mean length of 2.28 bits.
Symbols        x_1    x_2    x_3    x_4    x_5    x_6
Probabilities  0.5    0.15   0.17   0.08   0.06   0.04

Table 4.2. Probabilities associated with six events {X(n) = x_i}
The Huffman algorithm provides the binary tree shown in Figure 4.3. The coding
table is presented in Table 4.1. The mean length is 2.1 bits, very close to the theoretical
limit.
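With the codeword lengths obtained from the merges described above (1 bit for $x_1$, 3 bits for $x_2$, $x_3$ and $x_4$, 4 bits for $x_5$ and $x_6$; ties in the merge order can permute codewords of equal length), the mean length works out as:

$$\bar{l} = 0.5 \times 1 + 0.15 \times 3 + 0.17 \times 3 + 0.08 \times 3 + 0.06 \times 4 + 0.04 \times 4 = 2.1 \text{ bits}.$$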