16.A.2 Optimum unbiased estimate
The preceding method to create an unbiased estimate might appear to be rather
“ad hoc.” But it turns out that no unbiased estimate can be better than this!
To be more quantitative, assume that

$$\hat{x}_{\rm other} = x + \theta \qquad (16.81)$$

is a linear estimate of $x$ obtained from the same observations $\{y_k\}$ using some "other" procedure. This is clearly an unbiased estimate, with mean squared error

$$\mathcal{E}_{\rm other} = \mathcal{E}_\theta. \qquad (16.82)$$
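For instance (an illustration not in the original text): if the observation is simply $y = x + n$, with noise $n$ of zero mean and independent of $x$, then $\hat{x}_{\rm other} = y$ is one such unbiased estimate; here $\theta = n$, and $\mathcal{E}_{\rm other}$ is just the noise power $\mathcal{E}_n$.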
We now prove the following:

Lemma 16.5. Assume the error $\theta$ has zero mean, and that it is statistically independent of $x$ (in particular, $E[x\theta^*] = 0$). Then the MSE $\mathcal{E}_{\rm other}$ of the unbiased estimate (16.81) can never be better than that of the bias-removed MMSE estimate. In other words, $\mathcal{E}_{\rm other} \ge \mathcal{E}_{e,br}$.
Proof. Assume the contrary, that is,

$$\mathcal{E}_\theta = \mathcal{E}_{\rm other} < \mathcal{E}_{e,br}
= \frac{\mathcal{E}_e}{\alpha} \;\;\text{(from Eq. (16.80))}
= \frac{\mathcal{E}_e}{1 - \mathcal{E}_e/\mathcal{E}_x} \;\;\text{(from Eq. (16.73))}
= \frac{1}{\dfrac{1}{\mathcal{E}_e} - \dfrac{1}{\mathcal{E}_x}}.$$
Taking reciprocals of both sides (all of these quantities are positive), this is readily rearranged as

$$\frac{\mathcal{E}_\theta\,\mathcal{E}_x}{\mathcal{E}_\theta + \mathcal{E}_x} < \mathcal{E}_e. \qquad (16.83)$$
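As a sanity check on this rearrangement (a sketch added here, not part of the original text), the equivalence of the contrary assumption and Eq. (16.83) can be verified symbolically; the variable names below are mine:

```python
import sympy as sp

# Mean squared errors; all positive by definition.
E_theta, E_e, E_x = sp.symbols('E_theta E_e E_x', positive=True)

# Contrary assumption: E_theta < 1/(1/E_e - 1/E_x), written as "lhs > 0".
lhs = 1/(1/E_e - 1/E_x) - E_theta
# Eq. (16.83), written as "rhs > 0".
rhs = E_e - E_theta*E_x/(E_theta + E_x)

# The two differences share the same numerator, so each is positive
# exactly when the other is, provided E_x > E_e (i.e. alpha > 0).
print(sp.simplify(lhs/rhs))   # -> (E_theta + E_x)/(E_x - E_e)
```

Here $\mathcal{E}_x > \mathcal{E}_e$ holds because the MMSE can never exceed the error of the trivial estimate $\hat{x} = 0$, whose MSE is $\mathcal{E}_x$.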
Now define a new linear estimate of $x$ as follows:

$$\hat{x}_{\rm new} = \frac{\mathcal{E}_x}{\mathcal{E}_x + \mathcal{E}_\theta}\,\hat{x}_{\rm other} = \frac{\mathcal{E}_x}{\mathcal{E}_x + \mathcal{E}_\theta}\,(x + \theta). \qquad (16.84)$$
The error is given by

$$e_{\rm new} = \hat{x}_{\rm new} - x = \frac{\mathcal{E}_x}{\mathcal{E}_x + \mathcal{E}_\theta}\,\theta - \frac{\mathcal{E}_\theta}{\mathcal{E}_x + \mathcal{E}_\theta}\,x.$$
Since $E[x\theta^*] = 0$, this has mean square value

$$\mathcal{E}_{e,\rm new} = \left(\frac{\mathcal{E}_x}{\mathcal{E}_x + \mathcal{E}_\theta}\right)^{2}\mathcal{E}_\theta + \left(\frac{\mathcal{E}_\theta}{\mathcal{E}_x + \mathcal{E}_\theta}\right)^{2}\mathcal{E}_x = \frac{\mathcal{E}_x\,\mathcal{E}_\theta}{\mathcal{E}_x + \mathcal{E}_\theta} < \mathcal{E}_e \qquad (16.85)$$
using Eq. (16.83). Thus $\mathcal{E}_e > \mathcal{E}_{e,\rm new}$; since $\hat{x}_{\rm new}$ is itself a linear estimate of $x$ (a constant multiple of $\hat{x}_{\rm other}$), this contradicts the fact that $\mathcal{E}_e$ is the minimum mean squared error achievable by any linear estimate. This completes the proof.
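To make the lemma concrete (a numerical sketch added here, not part of the original text), the following Monte Carlo experiment compares the two errors in a simple scalar Gaussian setting; the model, the choice $\hat{x}_{\rm other} = y_1$, and all variable names are assumptions of this illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
var_x, var_n = 1.0, 0.25      # assumed signal and noise variances

# Assumed model: two independent noisy observations of the same x.
x  = rng.normal(0.0, np.sqrt(var_x), N)
y1 = x + rng.normal(0.0, np.sqrt(var_n), N)
y2 = x + rng.normal(0.0, np.sqrt(var_n), N)

# MMSE linear estimate from the averaged observation (noise variance var_n/2).
E_x   = var_x
E_e   = E_x * (var_n / 2) / (E_x + var_n / 2)   # minimum mean squared error
alpha = 1.0 - E_e / E_x                         # as in Eq. (16.73)
x_hat = (E_x / (E_x + var_n / 2)) * 0.5 * (y1 + y2)
x_br  = x_hat / alpha                           # bias-removed MMSE estimate

# An "other" unbiased estimate as in (16.81): use y1 alone, so theta = n1.
x_other = y1

E_br    = np.mean((x_br    - x) ** 2)   # expect var_n/2  = 0.125
E_other = np.mean((x_other - x) ** 2)   # expect var_n    = 0.25
print(E_br, E_other)                    # Lemma 16.5: E_other >= E_e,br
```

Discarding $y_2$ makes $\hat{x}_{\rm other}$ a deliberately suboptimal unbiased estimate; choosing $\hat{x}_{\rm other} = (y_1 + y_2)/2$ instead would make the two errors coincide, matching the equality case of the lemma.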
 