from Ex. 6.2. So $f_u(u)$ can be rewritten as

$$f_u(u) = \frac{1}{\sqrt{2\pi\sigma_x^2/2}}\, e^{-x_{re}^2/2(\sigma_x^2/2)} \times \frac{1}{\sqrt{2\pi\sigma_x^2/2}}\, e^{-x_{im}^2/2(\sigma_x^2/2)}. \qquad (6.62)$$
Since $\sigma_{x_{re}}^2 = \sigma_{x_{im}}^2 = \sigma_x^2/2$, this is nothing but the product of the individual pdfs of $x_{re}$ and $x_{im}$. This happens because, in the Gaussian case, the uncorrelatedness of $x_{re}$ and $x_{im}$ (induced by the circular symmetry of $x$) implies that they are statistically independent as well.
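As a quick numerical illustration (not from the text), the following Python sketch draws samples of a circularly symmetric complex Gaussian and confirms that the real and imaginary parts are uncorrelated with variance $\sigma_x^2/2$ each, and that the joint pdf factors as in (6.62); the variance value, seed, and test point are arbitrary choices.

```python
# Numerical sketch (not from the text): checks the factorization in (6.62)
# for a circularly symmetric complex Gaussian x with variance sigma_x2.
import numpy as np

rng = np.random.default_rng(0)
sigma_x2 = 2.0                       # assumed variance of the complex x
n = 1_000_000

# Circularly symmetric construction: independent real/imaginary parts,
# each with variance sigma_x2 / 2.
x_re = rng.normal(0.0, np.sqrt(sigma_x2 / 2), n)
x_im = rng.normal(0.0, np.sqrt(sigma_x2 / 2), n)

print(np.var(x_re), np.var(x_im))    # both close to sigma_x2 / 2 = 1.0
print(np.mean(x_re * x_im))          # close to 0: uncorrelated

# At an arbitrary test point, the circularly symmetric joint pdf
# equals the product of the two univariate Gaussian pdfs in (6.62).
u_re, u_im = 0.3, -0.7
joint = np.exp(-(u_re**2 + u_im**2) / sigma_x2) / (np.pi * sigma_x2)
s2 = sigma_x2 / 2

def marg(t):
    return np.exp(-t**2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

print(joint, marg(u_re) * marg(u_im))  # the two values agree
```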
Indeed, any pdf of the form (6.60) can always be rewritten in the form (6.62), which shows that such an $x$ has uncorrelated real and imaginary parts with identical variances; hence the expression (6.60) always represents a circularly symmetric complex Gaussian $x$. The pdf of a more general complex zero-mean Gaussian $x$ would have to be expressed in the form
$$f(x_{re}, x_{im}) = \frac{1}{\sqrt{\det(2\pi C_{uu})}}\, e^{-\frac{1}{2}\,[\, x_{re} \;\; x_{im} \,]\, C_{uu}^{-1} \begin{bmatrix} x_{re} \\ x_{im} \end{bmatrix}}, \qquad (6.63)$$

where

$$C_{uu} = \begin{bmatrix} \sigma_{x_{re}}^2 & \rho \\ \rho & \sigma_{x_{im}}^2 \end{bmatrix},$$

with $\rho = E[x_{re} x_{im}]$.
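The following Python sketch (an illustration, not part of the text) evaluates (6.63) directly and cross-checks it against SciPy's multivariate normal; the covariance entries and the test point are assumed example values.

```python
# Sketch (not from the text): evaluates the general zero-mean Gaussian
# pdf (6.63) and cross-checks it against scipy's multivariate normal.
import numpy as np
from scipy.stats import multivariate_normal

var_re, var_im, rho = 1.5, 0.8, 0.4        # assumed example values
C_uu = np.array([[var_re, rho],
                 [rho, var_im]])

u = np.array([0.2, -1.0])                  # arbitrary test point [x_re, x_im]

# Direct evaluation of (6.63).
quad = u @ np.linalg.inv(C_uu) @ u
f_direct = np.exp(-0.5 * quad) / np.sqrt(np.linalg.det(2 * np.pi * C_uu))

# Reference evaluation.
f_scipy = multivariate_normal(mean=np.zeros(2), cov=C_uu).pdf(u)
print(f_direct, f_scipy)                   # the two values agree
```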
6.6.4 Entropy of Gaussian random vectors
The differential entropy [Cover and Thomas, 1991] of a real random vector $u$ with pdf $f_u(u)$ is defined as

$$H(f_u) = -\int f_u(u) \ln f_u(u)\, du, \qquad (6.64)$$

where the integration is over all the components of $u$. When there is no confusion of notation, we indicate $H(f_u)$ by the simpler notation $H(u)$.
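A small Monte Carlo sketch (not from the text) makes the definition (6.64) concrete: for a scalar Gaussian, the sample average of $-\ln f_u(u)$ should approach the closed-form entropy $\frac{1}{2}\ln(2\pi e\sigma^2)$. The standard deviation, seed, and sample size below are arbitrary.

```python
# Sketch (not from the text): Monte Carlo estimate of the differential
# entropy (6.64), H(f_u) = -E[ln f_u(u)], for a scalar Gaussian, compared
# with the closed form 0.5 * ln(2*pi*e*sigma**2).
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.3                                 # assumed standard deviation
u = rng.normal(0.0, sigma, 1_000_000)

log_pdf = -0.5 * np.log(2 * np.pi * sigma**2) - u**2 / (2 * sigma**2)
H_mc = -np.mean(log_pdf)                    # sample average of -ln f_u(u)
H_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(H_mc, H_exact)                        # agree to a few decimal places
```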
If $x$ is a complex Gaussian vector, then its pdf can be expressed as in Eq. (6.54), where $u$ is as in Eq. (6.53) and $C_{uu}$ is its covariance. Then the entropy evaluated using Eq. (6.64) is given by

$$H(f_u) = \tfrac{1}{2} \ln \det(2\pi e\, C_{uu}). \qquad (6.65)$$
Note that the mean value of the random vector plays no role in this expression.
For the special case where $x - m_x$ is circularly symmetric, Eq. (6.46) holds. So, when a complex Gaussian vector $x$ is such that $x - m_x$ is circularly symmetric, the differential entropy is

$$H(f_x) = \ln \det(\pi e\, C_{xx}). \qquad (6.66)$$
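As a consistency check (an illustration, not from the text), the sketch below builds a valid complex covariance $C_{xx}$, forms the covariance $C_{uu}$ of the real composite vector $u$ using the standard relation $C_{uu} = \frac{1}{2}\begin{bmatrix} \mathrm{Re}\,C_{xx} & -\mathrm{Im}\,C_{xx} \\ \mathrm{Im}\,C_{xx} & \mathrm{Re}\,C_{xx} \end{bmatrix}$ for circularly symmetric $x$ (assumed here; the text's Eq. (6.53) defines $u$), and verifies that (6.65) and (6.66) give the same entropy.

```python
# Sketch (not from the text): checks that (6.65) and (6.66) agree for a
# circularly symmetric complex Gaussian vector x. The real composite vector
# u = [Re(x); Im(x)] is assumed to have covariance
# C_uu = 0.5 * [[Re C_xx, -Im C_xx], [Im C_xx, Re C_xx]].
import numpy as np

rng = np.random.default_rng(2)
N = 3
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
C_xx = A @ A.conj().T                       # an arbitrary valid covariance

R, I = C_xx.real, C_xx.imag
C_uu = 0.5 * np.block([[R, -I],
                       [I,  R]])

H_real = 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * C_uu))   # Eq. (6.65)
H_cplx = np.log(np.linalg.det(np.pi * np.e * C_xx).real)        # Eq. (6.66)
print(H_real, H_cplx)                       # equal up to rounding error
```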