replacing the multiplication by a table lookup indexed by the quantized values of the range and the LPS probability [44]. Given a minimum value $A_{\min}$ and a maximum value $A_{\max}$ for the range $A$, the range is restricted to four values:

$$A = A_{\min} + (k+1)\,\Delta A, \qquad k = 0, 1, 2, 3$$

where

$$\Delta A = \frac{A_{\max} - A_{\min}}{4}$$
The LPS probability $q_c$ can take on one of 64 possible values $q_m$ where

$$q_m = \alpha\, q_{m-1}, \qquad m = 1, 2, \ldots, 63 \tag{82}$$

where

$$\alpha = \left(\frac{0.01875}{0.5}\right)^{1/63} \qquad \text{and} \qquad q_0 = 0.5.$$
In the update equation, instead of multiplying the range and the lower limit, the range is
mapped to the closest of the four quantized ranges, the corresponding index k along with the
index m of the probability of the LPS are used as pointers into a lookup table, and the product
is read from the table.
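The lookup described above can be sketched as follows. This is an illustrative reconstruction, not the normative H.264 tables: the range bounds, the rounding, and the nearest-value quantization rule are assumptions made here for clarity (the standard quantizes the range with a bit shift).

```python
# Sketch of the M coder's multiplication-free range computation.
# ALPHA, the range bounds, and the quantization rule are illustrative
# assumptions, not the normative H.264/CABAC definitions.

ALPHA = (0.01875 / 0.5) ** (1 / 63)

# The 64 allowed LPS probabilities: q_m = alpha * q_{m-1}, q_0 = 0.5
Q = [0.5 * ALPHA ** m for m in range(64)]

A_MIN, A_MAX = 256, 510          # assumed bounds for a 9-bit range
DELTA_A = (A_MAX - A_MIN) / 4

# Quantized range values A = A_min + (k+1) * Delta_A, k = 0..3
A_QUANT = [A_MIN + (k + 1) * DELTA_A for k in range(4)]

# Precomputed products: RANGE_TABLE[k][m] is (approximately) A_k * q_m
RANGE_TABLE = [[round(a * q) for q in Q] for a in A_QUANT]

def lps_range(A, m):
    """Look up A * q_m without multiplying at coding time."""
    # Map the current range A to the closest quantized range index k,
    # then read the precomputed product from the table.
    k = min(range(4), key=lambda i: abs(A_QUANT[i] - A))
    return RANGE_TABLE[k][m]
```

At coding time only the comparison and the table read remain; all multiplications happen once, when the table is built.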
In order to make the coder adaptive, all we need to do is update the value of the LPS
probability as we see more data. If we see more occurrences of MPS, we should decrease
the LPS probability. If we encounter more occurrences of LPS, we should increase the LPS
probability. The M coder needs to do this while keeping the LPS probability restricted to the
64 allowed probabilities generated by Equation ( 82 ). This can be done by simply incrementing
or decrementing the index $m$ of $q_m$. When an MPS is encountered, we increment the index, and when an LPS is encountered, we decrement it. The index is incremented by one each time an MPS is encountered until it reaches the maximum allowed value, where it remains. When an LPS is encountered, the index is decremented by a variable amount until it reaches 0. At this point the LPS probability is one half, and a further occurrence of the LPS indicates that this symbol is no longer the least probable; therefore, the value of the MPS is changed. In other words, the MPS and LPS symbols are swapped. The M coder forms the core of the CABAC coder, which is part of the H.264 standard for video compression.
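The adaptation rule above can be sketched in a few lines. The function below is a hypothetical illustration: the standard drives these transitions from fixed tables, and the LPS decrement amount here is a parameter rather than the standard's state-dependent value.

```python
# Sketch of the M coder's adaptive probability-state update.
# The transition rule is illustrative; H.264/CABAC uses fixed
# precomputed transition tables with state-dependent LPS steps.

MAX_INDEX = 63

def update_state(m, mps, symbol, lps_decrement=1):
    """Return the updated index m and MPS value after coding one symbol."""
    if symbol == mps:
        # An MPS was seen: the LPS looks less likely, so move the index
        # up (q_m shrinks as m grows) until it saturates at 63.
        m = min(m + 1, MAX_INDEX)
    elif m == 0:
        # q_0 = 0.5: the "LPS" is now at least as likely as the MPS,
        # so swap the symbol roles instead of decrementing further.
        mps = 1 - mps
    else:
        # An LPS was seen: increase its probability by moving the index
        # down (by a variable amount in the standard; a fixed step here).
        m = max(m - lps_decrement, 0)
    return m, mps
```

For example, coding an LPS while already at index 0 leaves the index in place and flips the MPS, exactly the swap described in the text.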
4.7 Comparison of Huffman and Arithmetic Coding
We have described a new coding scheme that, although more complicated than Huffman coding, allows us to code sequences of symbols. How well this coding scheme works depends on how it is used. Let's first try to use this code for encoding sources for which we know the Huffman code.
Looking at Example 4.4.1, the average length for this code is

$$\bar{l} = 2 \times 0.5 + 3 \times 0.25 + 4 \times 0.125 + 4 \times 0.125 \tag{83}$$

$$= 2.75 \text{ bits/symbol} \tag{84}$$
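The arithmetic in Equations (83)–(84) is simply the expected codeword length, which can be checked directly:

```python
# Average codeword length from Eq. (83): sum over symbols of
# (codeword length) * (symbol probability), using the values
# quoted from Example 4.4.1 in the text.
lengths = [2, 3, 4, 4]
probs = [0.5, 0.25, 0.125, 0.125]
avg_len = sum(l * p for l, p in zip(lengths, probs))
# avg_len is 2.75 bits/symbol
```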
 