The product of two Gaussian functions with the same standard deviation is itself, up to a Gaussian factor evaluated at zero, a Gaussian function:

$$
\begin{aligned}
g(x;a,\sigma)\,g(x;b,\sigma)
&= \frac{1}{\sqrt{2\pi}\,(\sigma/\sqrt{2})}
   \exp\!\left(-\frac{1}{2}\,\frac{\bigl(x-(a+b)/2\bigr)^2}{(\sigma/\sqrt{2})^2}\right)
   \frac{1}{\sqrt{2\pi}\,(\sqrt{2}\,\sigma)}
   \exp\!\left(-\frac{1}{2}\,\frac{(a-b)^2}{(\sqrt{2}\,\sigma)^2}\right)\\[4pt]
&= g\!\left(x;\frac{a+b}{2},\frac{\sigma}{\sqrt{2}}\right)
   g\!\left(0;\,a-b,\,\sqrt{2}\,\sigma\right).
\end{aligned}
\tag{F.7}
$$
Since the integral of a Gaussian function is 1, we then obtain:

$$
\int_{-\infty}^{+\infty} g(x;a,\sigma)\,g(x;b,\sigma)\,dx
= g\!\left(0;\,a-b,\,\sqrt{2}\,\sigma\right).
\tag{F.8}
$$
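As a numerical sanity check of (F.7) and (F.8), the two sides can be compared on a grid (a sketch using NumPy; the values of `a`, `b`, and `sigma` and the grid width are arbitrary choices, not from the text):

```python
import numpy as np

def g(x, mu, sigma):
    """Gaussian PDF g(x; mu, sigma)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

a, b, sigma = 0.3, 1.1, 0.7
x = np.linspace(-10.0, 10.0, 200001)

# Left-hand side of (F.7): product of two equal-bandwidth Gaussians.
lhs = g(x, a, sigma) * g(x, b, sigma)
# Right-hand side of (F.7): Gaussian at the midpoint times g(0; a-b, sqrt(2)*sigma).
rhs = g(x, (a + b) / 2, sigma / np.sqrt(2)) * g(0.0, a - b, np.sqrt(2) * sigma)
assert np.allclose(lhs, rhs)

# (F.8): the integral of the product equals g(0; a-b, sqrt(2)*sigma)
# (Riemann sum on a fine grid; the integrand is negligible outside [-10, 10]).
integral = lhs.sum() * (x[1] - x[0])
assert np.isclose(integral, g(0.0, a - b, np.sqrt(2) * sigma), rtol=1e-5)
```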
Applying this theorem to the integral in (F.5), we get (using the kernel notation)

$$
\hat{H}_{R_2}(X)
= -\ln\!\left(\frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n}
  G_{\sqrt{2}\,h}(x_i - x_j)\right).
\tag{F.9}
$$
We then obtain an estimate of Rényi's quadratic entropy directly computable in terms of Gaussian functions.
Note that the MSE consistency of the Parzen window estimate (see Appendix E) directly implies the MSE consistency of this $\hat{H}_{R_2}$ estimate.
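The estimate (F.9) can be sketched directly from the pairwise-difference formula (a minimal NumPy sketch; the sample size, bandwidth `h = 0.4`, and Gaussian test source are illustrative assumptions, not from the text):

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """1-D Gaussian kernel G_sigma(u) = g(u; 0, sigma)."""
    return np.exp(-u ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def renyi_quadratic_entropy(x, h):
    """Estimate of Renyi's quadratic entropy, equation (F.9):
    -ln( (1/n^2) sum_i sum_j G_{sqrt(2) h}(x_i - x_j) )."""
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]          # all pairwise differences x_i - x_j
    info_potential = gaussian_kernel(diffs, np.sqrt(2) * h).sum() / n ** 2
    return -np.log(info_potential)

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=500)
# For a Gaussian source N(0, s^2) the true value is (1/2) ln(4*pi*s^2),
# so the estimate should land near 0.5 * ln(4*pi) here.
print(renyi_quadratic_entropy(sample, h=0.4))
```

The double sum is materialized as an $n \times n$ difference matrix for clarity; for large $n$ one would block the computation to bound memory.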
F.3 Plug-in Estimate of Shannon's Entropy
When $H$ is the Shannon entropy, $H_S(x) = -\int f(x)\ln f(x)\,dx$, which is the expected value of $-\ln f(x)$, we plug in the PDF estimate in the empirical formula of the expectation and obtain [1]:
$$
\hat{H}_S(X) = -\frac{1}{n}\sum_{i=1}^{n} \ln \hat{f}_n(x_i).
\tag{F.10}
$$
When $\hat{f}_n(x)$ is obtained by the Parzen window method with kernel $K$ and bandwidth $h$, we have:
$$
\hat{H}_S(X)
= -\frac{1}{n}\sum_{i=1}^{n}
  \ln\!\left(\frac{1}{n}\sum_{j=1}^{n} K_h(x_i - x_j)\right).
\tag{F.11}
$$
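The plug-in (resubstitution) estimate (F.11) with a Gaussian Parzen kernel can be sketched as follows (the sample size, the bandwidth `h = 0.3`, and the Gaussian test source are illustrative assumptions):

```python
import numpy as np

def shannon_plugin_entropy(x, h):
    """Resubstitution estimate of Shannon's entropy, equation (F.11),
    with a Gaussian Parzen kernel K_h:
    -(1/n) sum_i ln( (1/n) sum_j K_h(x_i - x_j) )."""
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]
    kernel = np.exp(-diffs ** 2 / (2 * h ** 2)) / (np.sqrt(2 * np.pi) * h)
    f_hat = kernel.mean(axis=1)              # Parzen estimate f_n(x_i) at each sample
    return -np.log(f_hat).mean()

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, size=1000)
# For N(0,1) the true Shannon entropy is (1/2) ln(2*pi*e) ~= 1.419.
print(shannon_plugin_entropy(sample, h=0.3))
```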
The $\hat{H}_S(X)$ estimate enjoys the following consistency properties [1, 159]: