Fig. 9 Illustration of the one-to-one input-to-output mapping of uniform quantization (Output Y versus Input X, with quantization step Q)
2.4 Entropy Coding
The last step of video coding is entropy coding. It translates symbols into a bitstream. Symbols can be the transformed and quantized residual values, the MVs, or the prediction modes. From information theory, we know that the average code length is bounded below by the entropy of the information source. This is why prediction, transformation, and quantization techniques are adopted in video coding systems to reduce the entropy of the encoded data. The entropy H is defined as
H = -\sum_i P(A_i) \log_2 P(A_i)    (5)
where A_i is the i-th symbol and P(A_i) is its probability. In addition, the optimal code length of a symbol A_i is its self-information, -log_2 P(A_i). Therefore, we have to build a probability model for each kind of symbol. Frequently appearing symbols are assigned shorter codewords. As a result, the final bitstream is shorter.
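
As a concrete illustration, the following minimal Python sketch evaluates Eq. (5) and the optimal per-symbol code lengths. The symbol probabilities are hypothetical (they are not given in the text; the same values are reused in the Huffman example below for consistency):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i P(A_i) * log2 P(A_i), as in Eq. (5)."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

# Hypothetical symbol probabilities, for illustration only.
probs = {"A": 0.10, "D": 0.12, "E": 0.23, "C": 0.25, "B": 0.30}

print(f"H = {entropy(probs):.3f} bits/symbol")  # ~2.208, the lower bound on average code length
for sym, p in sorted(probs.items(), key=lambda kv: kv[1]):
    # Optimal (generally non-integer) code length of each symbol: -log2 P(A_i)
    print(f"{sym}: {-math.log2(p):.2f} bits")
```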
Huffman coding is a famous entropy coding scheme. An example of Huffman coding is shown in Fig. 10. At the start, the symbols are sorted by their probability values from low to high. At each step, we choose the two symbols with the lowest probability. In this example, symbol A and symbol D are chosen first and combined into a new symbol {A, D} with probability 0.22. The symbol A, with lower probability, is put on the left side and assigned a bit 0. The symbol D, with higher probability, is put on the right side and assigned a bit 1. Then, symbol {A, D} and symbol E are chosen to form a symbol {{A, D}, E} with probability 0.45. Symbol {A, D} is assigned a bit 0 and symbol E is assigned a bit 1. With this process, we can finally build a codeword tree as shown in Fig. 10b. The final codeword of each symbol is assigned by traversing the tree from the root to the leaf. For example, the codeword of symbol D is 001. To encode BABCE, the bitstream is 11000111001. For Huffman coding, the codeword length is always an integer number.
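
A minimal Python sketch of this construction, following the same merge convention (lower-probability node to the left with bit 0), is shown below. The individual symbol probabilities are not stated in this excerpt, so the values used here are assumptions chosen to be consistent with the two merges described above ({A, D} = 0.22 and {{A, D}, E} = 0.45):

```python
import heapq
import itertools

def huffman_codes(probs):
    """Build Huffman codewords from a {symbol: probability} map.

    At each step the two least-probable nodes are merged; the lower-
    probability node goes left (prefix bit 0) and the other goes right
    (prefix bit 1), matching the convention described for Fig. 10.
    """
    tie = itertools.count()  # tie-breaker so equal probabilities never compare dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p_lo, _, lo = heapq.heappop(heap)  # lowest probability -> bit 0
        p_hi, _, hi = heapq.heappop(heap)  # second lowest      -> bit 1
        merged = {s: "0" + c for s, c in lo.items()}
        merged.update({s: "1" + c for s, c in hi.items()})
        heapq.heappush(heap, (p_lo + p_hi, next(tie), merged))
    return heap[0][2]

# Assumed probabilities (consistent with {A,D} = 0.22 and {{A,D},E} = 0.45):
probs = {"A": 0.10, "D": 0.12, "E": 0.23, "C": 0.25, "B": 0.30}
codes = huffman_codes(probs)
print(codes)                               # {'A': '000', 'D': '001', 'E': '01', 'C': '10', 'B': '11'}
print("".join(codes[s] for s in "BABCE"))  # 11000111001, as in the text
```

Under this assumed distribution, the average codeword length is 0.10*3 + 0.12*3 + 0.23*2 + 0.25*2 + 0.30*2 = 2.22 bits per symbol, slightly above the entropy of about 2.21 bits computed earlier; the gap exists precisely because Huffman codeword lengths are restricted to integers.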
 
 
 