Fig. 6.7 An encoding tree: the set {a_1, ..., a_9} is split recursively into two subsets at each node, with the two outgoing edges labeled 0 and 1; reading the edge labels from the root down yields the codewords a_1 = 0000, a_2 = 0001, a_3 = 001, a_4 = 01, a_5 = 100, a_6 = 1010, a_7 = 1011, a_8 = 110, a_9 = 111.
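Since the codewords of Fig. 6.7 are leaves of a binary tree, no codeword is a prefix of another, so a bit string can be decoded left to right without separators. The following is a minimal decoding sketch in Python, assuming the codeword table read off the figure; the function decode and the sample input are illustrative, not from the text.

# A minimal decoding sketch, assuming the codeword table of Fig. 6.7.
code = {"a1": "0000", "a2": "0001", "a3": "001", "a4": "01",
        "a5": "100", "a6": "1010", "a7": "1011", "a8": "110", "a9": "111"}
word_of = {w: name for name, w in code.items()}  # invert the table

def decode(bits):
    """Decode a bit string by matching codewords left to right."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in word_of:       # prefix-freeness makes the first match unique
            out.append(word_of[buf])
            buf = ""
    assert buf == "", "trailing bits do not form a codeword"
    return out

print(decode("000001111"))       # ['a1', 'a4', 'a9']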
information source. Let us briefly recall the Huffman method. Given k data, with k > 1 and with probabilities p_1, p_2, ..., p_k respectively, define the following process of tree construction.
1. Consider two probabilities p_1, p_2 such that no other probability is smaller than either of them; introduce a new probability p_1 + p_2, connect this new probability with its two summands, and label these edges with 0 and 1, respectively.
2. Apply the procedure of the previous step to the set of probabilities where p_1, p_2 are replaced by p_1 + p_2 (see Fig. 6.8), and iterate this step until only a unique probability is obtained (at each step the set of probabilities decreases by one unit, see Fig. 6.9).
3. Associate to the datum of probability p_j the sequence of labels of the edges connecting the probability p_j with the final unique probability (a runnable sketch of the whole process follows Fig. 6.8).
Fig. 6.8 The initial step of a Huffman code
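Steps 1-3 can be written compactly with a priority queue. The following is a minimal sketch in Python, assuming the data are given as (name, probability) pairs; the function name huffman_code and the example probabilities are illustrative, not from the text.

import heapq

def huffman_code(data):
    """Build codewords for data = [(name, probability), ...], with k > 1."""
    # Each heap entry: (probability, tie-breaker, partial codeword per name).
    heap = [(p, i, {name: ""}) for i, (name, p) in enumerate(data)]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Step 1: take two probabilities no other probability is smaller than.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Label the edge to the first summand 0 and to the second 1,
        # by prepending the label to every codeword below that node.
        merged = {name: "0" + w for name, w in c1.items()}
        merged.update({name: "1" + w for name, w in c2.items()})
        # Step 2: replace p1, p2 by p1 + p2 and iterate.
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    # Step 3: each datum receives the edge labels on its path to the root.
    return heap[0][2]

print(huffman_code([("a", 0.4), ("b", 0.2), ("c", 0.2), ("d", 0.1), ("e", 0.1)]))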
 