Then we can obtain the entropy of this random variable as
H(X_d) = -\sum_{i=-\infty}^{\infty} P(x_i) \log P(x_i)    (30)

       = -\sum_{i=-\infty}^{\infty} f_X(x_i)\Delta \log\bigl(f_X(x_i)\Delta\bigr)    (31)

       = -\sum_{i=-\infty}^{\infty} f_X(x_i)\Delta \log f_X(x_i) - \sum_{i=-\infty}^{\infty} f_X(x_i)\Delta \log \Delta    (32)

       = -\sum_{i=-\infty}^{\infty} f_X(x_i) \log f_X(x_i)\,\Delta - \log \Delta    (33)

where in the last step we have used the fact that \sum_{i=-\infty}^{\infty} f_X(x_i)\Delta approximates \int_{-\infty}^{\infty} f_X(x)\,dx = 1.
Taking the limit as \Delta \to 0 of Equation (33), the first term goes to -\int_{-\infty}^{\infty} f_X(x) \log f_X(x)\,dx, which looks like the analog to our definition of entropy for discrete sources. However, the second term is -\log \Delta, which goes to plus infinity when \Delta goes to zero. It seems there is no analog to entropy as defined for discrete sources. However, the first term in the limit serves some functions similar to those served by entropy in the discrete case and is a useful function in its own right. We call this term the differential entropy of a continuous source and denote it by h(X). In an analogous manner we can also define the conditional entropy h(X|Y) as

h(X|Y) = -\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y) \log f_{X|Y}(x|y)\,dx\,dy
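To make the role of the two terms in Equation (33) concrete, the sketch below is a minimal numerical illustration, not from the text: the exponential density f(x) = e^{-x} and the use of natural logarithms are assumptions made for convenience. It discretizes a continuous source with bin width Δ and compares H(X_d) with the Riemann sum that approaches h(X) and with the diverging -log Δ term.

```python
# Minimal sketch of Equation (33): H(X_d) ~= -sum_i f(x_i) ln f(x_i) * delta - ln(delta).
# Assumed example density: exponential, f(x) = exp(-x) for x >= 0, whose differential
# entropy is exactly 1 nat; natural logarithms are used throughout.
import math

def f(x):
    # Exponential density with rate 1 (illustrative assumption).
    return math.exp(-x) if x >= 0 else 0.0

for delta in (0.5, 0.1, 0.01, 0.001):
    xs = [i * delta for i in range(int(50 / delta))]          # truncate the tail at x = 50
    probs = [f(x) * delta for x in xs]                        # P(x_i) ~= f(x_i) * delta
    H_d = -sum(p * math.log(p) for p in probs if p > 0)       # entropy of the discretized source
    first_term = -sum(f(x) * math.log(f(x)) * delta for x in xs if f(x) > 0)  # -> h(X) = 1 nat
    print(f"delta = {delta:<6}  H(X_d) = {H_d:7.4f}   "
          f"first term = {first_term:6.4f}   -ln(delta) = {-math.log(delta):7.4f}")
```

As Δ shrinks, H(X_d) grows without bound while the first term settles near h(X) = 1 nat, which is exactly the behaviour described above.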
Example 8.4.4:
Suppose we have a random variable X that is uniformly distributed in the interval [a, b). The differential entropy of this random variable is given by

h(X) = -\int_{-\infty}^{\infty} f_X(x) \log f_X(x)\,dx    (34)

     = -\int_{a}^{b} \frac{1}{b-a} \log \frac{1}{b-a}\,dx    (35)

     = \log(b - a)    (36)
Notice that when b - a is less than one, the differential entropy becomes negative, in contrast to the entropy of a discrete source, which never takes on negative values.
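Equations (34)-(36) are easy to check numerically; the sketch below is a minimal illustration (base-2 logarithms are an arbitrary choice here, so the values are in bits) that evaluates both the integral and the closed form for a few intervals and shows the negative value once b - a drops below one.

```python
# Check Equations (34)-(36) for a uniform density on [a, b): h(X) = log(b - a).
# Base-2 logarithms are an arbitrary choice here, so the results are in bits.
import math

def h_uniform_closed_form(a, b):
    return math.log2(b - a)                                   # Equation (36)

def h_uniform_numeric(a, b, n=100_000):
    # Riemann-sum version of -integral_a^b f(x) log2 f(x) dx with f(x) = 1/(b - a).
    dx = (b - a) / n
    f = 1.0 / (b - a)
    return -sum(f * math.log2(f) * dx for _ in range(n))

for a, b in [(0.0, 4.0), (0.0, 1.0), (0.0, 0.25)]:
    print(f"[{a}, {b}):  integral = {h_uniform_numeric(a, b):7.3f} bits,  "
          f"log2(b - a) = {h_uniform_closed_form(a, b):7.3f} bits")
```

The [0, 0.25) case gives -2 bits, illustrating the remark above that differential entropy, unlike discrete entropy, can be negative.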
Later in this chapter, we will find particular use for the differential entropy of the Gaussian
source.