\[
\mathrm{var}(\hat{\omega}) \;\ge\; \frac{12\,\sigma^2}{A^2\,N\,(N^2 - 1)},
\qquad
\mathrm{var}(\hat{A}) \;\ge\; \frac{2\,\sigma^2}{N},
\qquad
\mathrm{var}(\hat{\phi}) \;\ge\; \frac{2\,\sigma^2\,(2N - 1)}{A^2\,N\,(N + 1)}
\]
We note that there does not exist an efficient estimator with finite N in this case;
see the previous discussion.
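As a quick illustration (ours, not from the text), the following Python sketch simply evaluates the three bounds as written above, assuming the usual setup of a real sinusoid A sin(ωn + φ) observed in white Gaussian noise of variance σ² over N samples; the function name sinusoid_crb is purely illustrative.

```python
def sinusoid_crb(A, sigma2, N):
    """Evaluate the Cramer-Rao bounds quoted above for (omega, A, phi)."""
    var_omega = 12 * sigma2 / (A**2 * N * (N**2 - 1))          # decreases as 1/N^3
    var_A = 2 * sigma2 / N                                      # decreases as 1/N
    var_phi = 2 * sigma2 * (2 * N - 1) / (A**2 * N * (N + 1))   # decreases as 1/N
    return var_omega, var_A, var_phi

# Example: the frequency bound improves much faster with N than the other two.
for N in (100, 1000):
    print(N, sinusoid_crb(A=1.0, sigma2=0.1, N=N))
```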
Let us now consider the second case, where the mean is s_N(θ) = 0. The observations thus consist of random signals with zero mean and covariance matrix R_N(θ). The Fisher information matrix is obtained from the derivatives of the covariance matrix in the following manner:
\[
\bigl[F(\theta)\bigr]_{ij}
\;=\;
\frac{1}{2}\,\mathrm{Tr}\!\left[
R_N^{-1}(\theta)\,\frac{\partial R_N(\theta)}{\partial \theta_i}\,
R_N^{-1}(\theta)\,\frac{\partial R_N(\theta)}{\partial \theta_j}
\right]
\]
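As a minimal numerical sketch of this formula (not taken from the text), the Python function below computes [F(θ)]_ij from a covariance matrix R_N(θ) and its derivatives with respect to each parameter, supplied as explicit matrices. The toy covariance model in the usage example, R(θ)[m, n] = θ₁ θ₂^|m−n|, is purely illustrative.

```python
import numpy as np

def fisher_information(R, dR):
    """[F(theta)]_ij = 1/2 Tr[ R^-1 dR_i R^-1 dR_j ] for a zero-mean
    Gaussian vector with covariance R(theta).

    R  : (N, N) covariance matrix evaluated at theta
    dR : list of (N, N) derivative matrices dR/dtheta_i, one per parameter
    """
    R_inv = np.linalg.inv(R)
    A = [R_inv @ D for D in dR]                  # R^-1 dR/dtheta_i
    q = len(dR)
    F = np.zeros((q, q))
    for i in range(q):
        for j in range(i, q):
            F[i, j] = F[j, i] = 0.5 * np.trace(A[i] @ A[j])
    return F

# Illustrative usage with the toy model R(theta)[m, n] = theta1 * theta2**|m - n|.
N = 50
theta1, theta2 = 2.0, 0.8
lags = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
R = theta1 * theta2**lags
dR_dtheta1 = theta2**lags                        # dR/dtheta1
dR_dtheta2 = theta1 * lags * theta2**(lags - 1.0)  # dR/dtheta2
F = fisher_information(R, [dR_dtheta1, dR_dtheta2])
crb = np.linalg.inv(F)                           # Cramer-Rao bound on (theta1, theta2)
```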
When looking for a condition for the existence of a minimum-variance estimator, we quickly notice that we end up with implicit equations, so that it becomes practically impossible to calculate the minimum-variance estimator. Nevertheless, the expression of the Fisher information matrix sometimes leads to simple expressions, as shown in the following example.
EXAMPLE 3.2. We try here to calculate the Cramér-Rao bound for an AR process of order p with Gaussian excitation, for which the parameter vector may be written as θ = (a_1, …, a_p, σ²)^T. Friedlander and Porat [FRI 89] demonstrated, using the previous expression, that the Fisher information matrix could be written in this case as:
\[
F_N(\theta) \;=\; F_0(\theta) \;+\; (N - p)
\begin{pmatrix}
\dfrac{1}{\sigma^2}\,R_p & 0 \\[1ex]
0 & \dfrac{1}{2\sigma^4}
\end{pmatrix}
\]
where F_0(θ) is a matrix that does not depend on N and where, for k, l = 1, …, p, [R_p]_{k,l} = γ_xx(k − l), with γ_xx(·) as the correlation function of the process.
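To make the example concrete, here is a small Python sketch (ours, not from [FRI 89]) that builds R_p from the AR coefficients via the Yule-Walker equations and evaluates only the N-dependent part of the bound above, i.e. it neglects the F_0(θ) term. The assumed AR convention is x(n) + a_1 x(n−1) + … + a_p x(n−p) = e(n) with var(e) = σ², and the function names are illustrative.

```python
import numpy as np

def ar_autocovariances(a, sigma2):
    """gamma_xx(0..p) of the AR(p) process x(n) + sum_k a_k x(n-k) = e(n),
    obtained by solving the augmented Yule-Walker system
    gamma(m) + sum_k a_k gamma(m - k) = sigma2 * delta(m),  m = 0..p."""
    p = len(a)
    coeffs = np.concatenate(([1.0], a))            # [1, a_1, ..., a_p]
    M = np.zeros((p + 1, p + 1))
    for m in range(p + 1):
        for k in range(p + 1):
            M[m, abs(m - k)] += coeffs[k]          # uses gamma(-j) = gamma(j)
    rhs = np.zeros(p + 1)
    rhs[0] = sigma2
    return np.linalg.solve(M, rhs)

def ar_crb_dominant(a, sigma2, N):
    """CRB on theta = (a_1, ..., a_p, sigma^2)^T keeping only the term of the
    Fisher information that grows with N (F_0(theta) is neglected)."""
    p = len(a)
    gamma = ar_autocovariances(a, sigma2)
    R_p = np.array([[gamma[abs(k - l)] for l in range(p)] for k in range(p)])
    F = np.zeros((p + 1, p + 1))
    F[:p, :p] = (N - p) / sigma2 * R_p             # block for the a_k
    F[p, p] = (N - p) / (2.0 * sigma2**2)          # element for sigma^2
    return np.linalg.inv(F)

# Example: stable AR(2) process, N = 1000 samples.
print(ar_crb_dominant(a=[-1.5, 0.7], sigma2=1.0, N=1000))
```

For large N the neglected F_0(θ) term becomes insignificant, so this dominant part is the usual asymptotic approximation of the bound.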