Example 8.4.5: Suppose we have a random variable $X$ that has a Gaussian pdf,
$$
f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right) \qquad (37)
$$
The differential entropy is given by
$$
\begin{aligned}
h(X) &= -\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right) \log\left[ \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right) \right] dx \qquad (38) \\
&= -\int_{-\infty}^{\infty} f_X(x) \log\frac{1}{\sqrt{2\pi\sigma^2}}\, dx + \int_{-\infty}^{\infty} \frac{(x-\mu)^2}{2\sigma^2} \log e\, f_X(x)\, dx \qquad (39) \\
&= \frac{1}{2}\log 2\pi\sigma^2 + \frac{1}{2}\log e \qquad (40) \\
&= \frac{1}{2}\log 2\pi e\sigma^2 \qquad (41)
\end{aligned}
$$
where the last two steps use the facts that $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$ and $\int_{-\infty}^{\infty} (x-\mu)^2 f_X(x)\,dx = \sigma^2$.
Thus, the differential entropy of a Gaussian random variable is an increasing function of its
variance.
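As a quick numerical sanity check (an addition, not part of the original text), the sketch below evaluates the integral in Equation (38) by quadrature and compares it against the closed form of Equation (41) for a few values of $\sigma$; the use of SciPy and of base-2 logarithms (entropy in bits) are assumptions.

```python
# Sketch: numerically verify Eq. (41), h(X) = (1/2) log2(2*pi*e*sigma^2),
# by integrating Eq. (38) directly. Assumes NumPy/SciPy and log base 2.
import numpy as np
from scipy.integrate import quad

def gaussian_pdf(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def h_numeric(mu, sigma):
    # h(X) = -integral of f(x) log2 f(x) dx, truncated to +/- 20 sigma
    integrand = lambda x: -gaussian_pdf(x, mu, sigma) * np.log2(gaussian_pdf(x, mu, sigma))
    value, _ = quad(integrand, mu - 20 * sigma, mu + 20 * sigma)
    return value

for sigma in (0.5, 1.0, 2.0):
    closed = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)  # Eq. (41)
    print(f"sigma={sigma}: numeric={h_numeric(0.0, sigma):.6f}, closed={closed:.6f}")
```

The printed values grow with $\sigma$, matching the observation that the Gaussian differential entropy increases with the variance.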
The differential entropy for the Gaussian distribution has the added distinction that it is
larger than the differential entropy for any other continuously distributed random variable with
the same variance. That is, for any random variable $X$ with variance $\sigma^2$,
$$
h(X) \le \frac{1}{2}\log\left( 2\pi e\sigma^2 \right) \qquad (42)
$$
The proof of this statement depends on the fact that for any two continuous distributions $f_X(x)$ and $g_X(x)$,
$$
-\int_{-\infty}^{\infty} f_X(x) \log f_X(x)\, dx \le -\int_{-\infty}^{\infty} f_X(x) \log g_X(x)\, dx \qquad (43)
$$
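To see Equation (43) in action (an added illustration, not from the original), both sides can be evaluated numerically for a hypothetical pair of densities; the particular choice below, a unit-variance Laplacian for $f_X$ and a unit-variance Gaussian for $g_X$, is an assumption made for demonstration.

```python
# Sketch: check Eq. (43), -int f log f dx <= -int f log g dx, for
# f = Laplacian and g = Gaussian (both zero mean, unit variance).
# The choice of densities is arbitrary; any pair would do.
import numpy as np
from scipy.integrate import quad

b = 1 / np.sqrt(2)  # Laplacian scale: variance = 2*b^2 = 1

def f(x):  # zero-mean, unit-variance Laplacian density
    return np.exp(-np.abs(x) / b) / (2 * b)

def g(x):  # zero-mean, unit-variance Gaussian density
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

lhs, _ = quad(lambda x: -f(x) * np.log2(f(x)), -30, 30, points=[0])
rhs, _ = quad(lambda x: -f(x) * np.log2(g(x)), -30, 30, points=[0])
print(lhs, rhs, lhs <= rhs)  # about 1.943 <= 2.047 -> True
```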
We will not prove Equation (43) here, but you may refer to [110] for a simple proof. To obtain Equation (42), we substitute the expression for the Gaussian distribution for $g_X(x)$, taking $\mu$ and $\sigma^2$ to be the mean and variance of $X$. Noting that the left-hand side of Equation (43) is simply the differential entropy of the random variable $X$, we have
$$
\begin{aligned}
h(X) &\le -\int_{-\infty}^{\infty} f_X(x) \log\left[ \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right) \right] dx \\
&= \frac{1}{2}\log(2\pi\sigma^2) + \int_{-\infty}^{\infty} f_X(x)\, \frac{(x-\mu)^2}{2\sigma^2}\, \log e\, dx \\
&= \frac{1}{2}\log(2\pi\sigma^2) + \frac{\log e}{2\sigma^2} \int_{-\infty}^{\infty} f_X(x)(x-\mu)^2\, dx \\
&= \frac{1}{2}\log(2\pi e\sigma^2) \qquad (44)
\end{aligned}
$$
where the final step uses the fact that $\int_{-\infty}^{\infty} f_X(x)(x-\mu)^2\, dx = \sigma^2$. This is precisely the bound in Equation (42).
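For a concrete instance of the bound (an added example, not in the original): a random variable uniform on $[0, 1]$ has variance $1/12$ and differential entropy $0$ bits, strictly below the Gaussian bound of Equation (42).

```python
# Sketch: Eq. (42) for X uniform on [0, 1].
# h(X) = -int_0^1 1 * log2(1) dx = 0 bits, and sigma^2 = 1/12.
import numpy as np

sigma2 = 1.0 / 12.0                               # variance of U[0, 1]
h_uniform = 0.0                                   # differential entropy, bits
bound = 0.5 * np.log2(2 * np.pi * np.e * sigma2)  # RHS of Eq. (42)
print(h_uniform, bound, h_uniform <= bound)       # 0.0 0.2546... True
```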