The number of bits per symbol can be bounded as

$$
\frac{H\!\left(S^{(n)}\right)}{n} \;\le\; R \;<\; \frac{H\!\left(S^{(n)}\right)}{n} + \frac{1}{n} \qquad (6)
$$
In order to compare this to (5), and to see the advantage we get from encoding symbols in blocks instead of one at a time, we need to express $H\!\left(S^{(n)}\right)$ in terms of $H(S)$. This turns out to be a relatively easy (although somewhat messy) thing to do.
Since the symbols are independent and identically distributed, the joint probability of a block factors into a product of the individual symbol probabilities, and we can write

$$
\begin{aligned}
H\!\left(S^{(n)}\right) &= -\sum_{i_1=1}^{m}\sum_{i_2=1}^{m}\cdots\sum_{i_n=1}^{m} P\!\left(a_{i_1},a_{i_2},\ldots,a_{i_n}\right)\log\!\left[P\!\left(a_{i_1},a_{i_2},\ldots,a_{i_n}\right)\right] \\
&= -\sum_{i_1=1}^{m}\sum_{i_2=1}^{m}\cdots\sum_{i_n=1}^{m} P\!\left(a_{i_1}\right)P\!\left(a_{i_2}\right)\cdots P\!\left(a_{i_n}\right)\log\!\left[P\!\left(a_{i_1}\right)P\!\left(a_{i_2}\right)\cdots P\!\left(a_{i_n}\right)\right] \\
&= -\sum_{i_1=1}^{m}\sum_{i_2=1}^{m}\cdots\sum_{i_n=1}^{m} P\!\left(a_{i_1}\right)P\!\left(a_{i_2}\right)\cdots P\!\left(a_{i_n}\right)\sum_{j=1}^{n}\log\!\left[P\!\left(a_{i_j}\right)\right] \\
&= -\sum_{i_1=1}^{m} P\!\left(a_{i_1}\right)\log\!\left[P\!\left(a_{i_1}\right)\right]\left\{\sum_{i_2=1}^{m}\cdots\sum_{i_n=1}^{m} P\!\left(a_{i_2}\right)\cdots P\!\left(a_{i_n}\right)\right\} \\
&\qquad - \sum_{i_2=1}^{m} P\!\left(a_{i_2}\right)\log\!\left[P\!\left(a_{i_2}\right)\right]\left\{\sum_{i_1=1}^{m}\sum_{i_3=1}^{m}\cdots\sum_{i_n=1}^{m} P\!\left(a_{i_1}\right)P\!\left(a_{i_3}\right)\cdots P\!\left(a_{i_n}\right)\right\} \\
&\qquad \;\;\vdots \\
&\qquad - \sum_{i_n=1}^{m} P\!\left(a_{i_n}\right)\log\!\left[P\!\left(a_{i_n}\right)\right]\left\{\sum_{i_1=1}^{m}\sum_{i_2=1}^{m}\cdots\sum_{i_{n-1}=1}^{m} P\!\left(a_{i_1}\right)P\!\left(a_{i_2}\right)\cdots P\!\left(a_{i_{n-1}}\right)\right\}
\end{aligned}
$$
The $n-1$ summations in braces in each term sum to one. Therefore,
$$
\begin{aligned}
H\!\left(S^{(n)}\right) &= -\sum_{i_1=1}^{m} P\!\left(a_{i_1}\right)\log\!\left[P\!\left(a_{i_1}\right)\right] - \sum_{i_2=1}^{m} P\!\left(a_{i_2}\right)\log\!\left[P\!\left(a_{i_2}\right)\right] - \cdots - \sum_{i_n=1}^{m} P\!\left(a_{i_n}\right)\log\!\left[P\!\left(a_{i_n}\right)\right] \\
&= nH(S)
\end{aligned}
$$
and we can write (6) as

$$
H(S) \;\le\; R \;<\; H(S) + \frac{1}{n} \qquad (7)
$$
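As a concrete numerical check of the identity $H\!\left(S^{(n)}\right) = nH(S)$, the following sketch computes the block entropy directly from the joint probabilities of an iid source and compares it with $n$ times the single-symbol entropy. The alphabet size and probabilities here are illustrative assumptions, not values from the text:

```python
import itertools
import math

def entropy(probs):
    """Entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical iid source: three letters with illustrative probabilities.
P = [0.5, 0.3, 0.2]
n = 3  # block length

# Because the source is iid, the probability of each length-n block is
# the product of the individual symbol probabilities.
block_probs = [math.prod(block) for block in itertools.product(P, repeat=n)]

print(entropy(block_probs), n * entropy(P))  # the two values agree
```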
By comparing this to (5), we can see that encoding the output of the source in longer blocks of symbols guarantees a rate closer to the entropy. Note that all we are talking about here is a bound or guarantee about the rate. As we have seen in the previous chapter, there are a number of situations in which we can achieve a rate equal to the entropy with a block length of one!
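Both of these points can be checked numerically. The sketch below is a minimal illustration, not a construction from the text: it assumes a hypothetical skewed binary source and uses Huffman coding of the $n$-th extension (Huffman codes are optimal, so their rate also satisfies the bound in (7)). It shows the per-symbol rate closing in on $H(S)$ as $n$ grows, and a source with dyadic (power-of-one-half) probabilities whose rate equals the entropy already at block length one:

```python
import heapq
import itertools
import math

def entropy(probs):
    """Entropy in bits/symbol of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_avg_length(probs):
    """Average codeword length of an optimal (Huffman) binary code.

    Each merge of the two least probable nodes adds one bit to every
    symbol beneath them, so the average length is the sum of the
    merged probabilities over all merges.
    """
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        merged = heapq.heappop(heap) + heapq.heappop(heap)
        total += merged
        heapq.heappush(heap, merged)
    return total

# Hypothetical skewed binary source: the per-symbol rate stays inside
# [H(S), H(S) + 1/n) and approaches H(S) as the block length n grows.
P = [0.8, 0.2]
for n in (1, 2, 4, 8):
    blocks = [math.prod(b) for b in itertools.product(P, repeat=n)]
    rate = huffman_avg_length(blocks) / n  # bits per original symbol
    print(f"n={n}: rate={rate:.4f}  (H(S)={entropy(P):.4f})")

# Dyadic probabilities: the rate equals the entropy at block length one.
Q = [0.5, 0.25, 0.125, 0.125]
print(huffman_avg_length(Q), entropy(Q))  # both are 1.75
```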