This last result is a monotonicity theorem first proved by A. Stam in 1959 [222]. It is also presented and discussed in [50]. This and other results are sometimes expressed in terms of the entropy power, defined as N(X) = (1/(2πe)) e^{2H_S(X)}. We now present a corollary of this theorem justifying the larger Shannon entropy of Gaussian-smoothed distributions (see Chap. 3).
Corollary B.1. Let X and Y be independent continuous random variables and f_Z = f_X ∗ f_Y. If Y is Gaussian distributed with variance h², then

H_S(Z) = H_S(X + Y) ≥ H_S(X).    (B.3)
Proof. From the monotonicity theorem we have:

N(X + Y) ≥ N(X) + N(Y).    (B.4)

Since Y is a Gaussian r.v. we also have N(Y) = (1/(2πe)) e^{2H_S(Y)} = (1/(2πe)) e^{2 ln(h√(2πe))} = h² > 0, and the above result follows.
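As a quick numerical illustration of Corollary B.1 (this sketch is ours, not part of the original text), the following Python fragment convolves a density with a Gaussian of variance h² on a grid and checks that the Shannon entropy increases, and that N(Y) = h². The grid limits, the step size, and the choice of a uniform f_X are illustrative assumptions.

    import numpy as np

    def shannon_entropy(f, dx):
        # H_S(f) = -integral f(x) ln f(x) dx, approximated by a Riemann sum
        p = f[f > 1e-300]
        return -np.sum(p * np.log(p)) * dx

    dx = 1e-3
    x = np.arange(-20.0, 20.0, dx)

    # f_X: uniform density on [-1, 1] (any non-Gaussian density works here)
    f_x = np.where(np.abs(x) <= 1.0, 0.5, 0.0)

    # f_Y: zero-mean Gaussian density with variance h^2
    h = 0.5
    f_y = np.exp(-x**2 / (2 * h**2)) / (h * np.sqrt(2 * np.pi))

    # f_Z = f_X * f_Y (convolution), i.e. the density of Z = X + Y
    f_z = np.convolve(f_x, f_y, mode="same") * dx

    H_x = shannon_entropy(f_x, dx)
    H_y = shannon_entropy(f_y, dx)
    H_z = shannon_entropy(f_z, dx)

    print(H_z >= H_x)                                                         # Corollary B.1: True
    print(np.isclose(np.exp(2 * H_y) / (2 * np.pi * np.e), h**2, rtol=1e-3))  # N(Y) = h^2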
B.2 Rényi's Entropy
Rényi's entropy of order α of a continuous random variable X with density f is given by

H_Rα(X) = (1/(1−α)) ln ∫ f^α(x) dx = (1/(1−α)) ln E[f^{α−1}(X)],  α > 0, α ≠ 1.    (B.5)
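The two expressions in (B.5), the integral of f^α and the expectation of f^{α−1}(X), can be checked against each other numerically. The following Python sketch is our illustration, with an arbitrary Gaussian density, sample size, and α = 2; for a Gaussian both forms should be close to ln(2σ√π) (cf. property 6 below).

    import numpy as np

    rng = np.random.default_rng(0)
    sigma, alpha = 1.5, 2.0          # illustrative choices

    def gauss_pdf(x, sigma):
        return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    # Integral form of (B.5): (1/(1 - alpha)) ln integral f^alpha(x) dx
    dx = 1e-3
    x = np.arange(-12.0, 12.0, dx)
    H_integral = np.log(np.sum(gauss_pdf(x, sigma)**alpha) * dx) / (1 - alpha)

    # Expectation form of (B.5): (1/(1 - alpha)) ln E[f^(alpha-1)(X)],
    # estimated by a sample mean over draws of X
    sample = rng.normal(0.0, sigma, size=200_000)
    H_expect = np.log(np.mean(gauss_pdf(sample, sigma)**(alpha - 1))) / (1 - alpha)

    # For alpha = 2 and a Gaussian, both should be close to ln(2*sigma*sqrt(pi))
    print(H_integral, H_expect, np.log(2 * sigma * np.sqrt(np.pi)))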
A list of important properties of Rényi's entropy [47, 62] is:
1. H_Rα can be positive or negative, but H_R2 is non-negative, with minimum value (0) corresponding to a Dirac-δ comb.
2. Invariance to translations: H_Rα(X + c) = H_Rα(X) for a constant c.
3. Change of scale: H_Rα(aX) = H_Rα(X) + ln|a| for a constant a.
4. With the conditional entropy defined similarly as for the Shannon entropy (Property 4 of Sect. B.1), the inequality H_Rα(X|Y) ≤ H_Rα(X) only holds for α ≥ 1 and f(x) ≤ 1 in the whole support.
5. H_Rα(X_1, ..., X_n) = Σ_{i=1}^{n} H_Rα(X_i) for independent random variables.
6. For α = 2 and univariate distributions with finite variance σ², the maximizer of Rényi's quadratic entropy is

   f(x) = (Γ(5/2) / (Γ(2) √(5π) σ)) (1 − x²/(5σ²)),

   with support |x| ≤ √5 σ. The general formulas of the maximizing density of the α-Rényi entropy are given in [47]. Note that the Rényi entropy of the univariate normal distribution [162] is H_Rα(g(x; μ, σ)) = ln(√(2π) σ) − ln α/(2(1−α)); therefore, H_R2(g(x; μ, σ)) = ln(2σ√π). (A numerical check of this maximizer and of the α → 1 limit in property 7 is sketched at the end of this section.)
7. H_Rα(X) → H_S(X) as α → 1.
Note that a result similar to the one in Corollary B.1 for Rényi's quadratic entropy is a trivial consequence of properties 1 and 5.
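The following Python sketch (our own check, under arbitrary grid and σ choices) illustrates properties 6 and 7: the compact-support density of property 6 attains a larger quadratic Rényi entropy than a Gaussian with the same variance, the Gaussian value matches ln(2σ√π), and H_Rα approaches H_S as α → 1.

    import numpy as np
    from math import gamma

    dx = 1e-4
    sigma = 1.0
    x = np.arange(-10.0, 10.0, dx)

    def renyi_entropy(f, alpha, dx):
        # H_R_alpha = (1/(1 - alpha)) ln integral f^alpha(x) dx (Riemann sum)
        return np.log(np.sum(f[f > 0]**alpha) * dx) / (1 - alpha)

    def shannon_entropy(f, dx):
        p = f[f > 1e-300]
        return -np.sum(p * np.log(p)) * dx

    # Property 6 maximizer: f(x) = Gamma(5/2)/(Gamma(2) sqrt(5 pi) sigma) (1 - x^2/(5 sigma^2))
    c = gamma(2.5) / (gamma(2.0) * np.sqrt(5 * np.pi) * sigma)
    f_max = np.where(x**2 <= 5 * sigma**2, c * (1 - x**2 / (5 * sigma**2)), 0.0)

    # Gaussian with the same variance sigma^2
    f_gauss = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    print(renyi_entropy(f_max, 2.0, dx) > renyi_entropy(f_gauss, 2.0, dx))    # True
    print(np.isclose(renyi_entropy(f_gauss, 2.0, dx),
                     np.log(2 * sigma * np.sqrt(np.pi)), atol=1e-3))          # H_R2 of the Gaussian
    print(np.isclose(renyi_entropy(f_gauss, 1.001, dx),
                     shannon_entropy(f_gauss, dx), atol=1e-2))                # Property 7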