Proof. The mean square error can be expressed as
$$
\mathcal{E}_e = E[ee^*] = E\big[e\,(x-\hat{x})^*\big] = E[ex^*] \qquad \text{(using Eq. (16.70))}
$$
$$
= (1-\alpha)\,E[|x|^2] \qquad \text{(using Eq. (16.69))},
$$
so that
$$
\mathcal{E}_e = (1-\alpha)\,\mathcal{E}_x, \tag{16.74}
$$
which proves Eq. (16.73). But in view of Eq. (16.68), we also have from Eq. (16.69)
$$
\mathcal{E}_e = (1-\alpha)^2\,\mathcal{E}_x + \mathcal{E}_\tau, \tag{16.75}
$$
where we have used the fact that $\alpha$ is real. Equating the right-hand sides of Eqs. (16.74) and (16.75) we get $\mathcal{E}_\tau = \alpha(1-\alpha)\,\mathcal{E}_x$, which is Eq. (16.71). Dividing (16.74) by (16.71) then yields $\mathcal{E}_\tau = \alpha\,\mathcal{E}_e$, that is, Eq. (16.72).
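The relations just proved are easy to check by simulation. The sketch below is only an illustration, not part of the original text: the scalar model $y = x + n$, the values of the symbol energy and noise variance, and the use of the linear MMSE coefficient $\alpha = \mathcal{E}_x/(\mathcal{E}_x + \sigma_n^2)$ are assumptions introduced here so that Eqs. (16.74), (16.71), and (16.72) can be compared against Monte-Carlo estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000        # Monte-Carlo trials
Ex = 1.0             # symbol energy E[x^2]   (assumed value, illustration only)
sigma2 = 0.5         # noise variance E[n^2]  (assumed value, illustration only)

# Assumed scalar model: y = x + n, with x and n independent and zero mean.
x = rng.normal(scale=np.sqrt(Ex), size=N)
n = rng.normal(scale=np.sqrt(sigma2), size=N)
y = x + n

# Linear MMSE estimate x_hat = alpha*y, which has the form alpha*x + tau with tau = alpha*n.
alpha = Ex / (Ex + sigma2)
x_hat = alpha * y
tau = alpha * n
e = x - x_hat            # estimation error

Ee = np.mean(e**2)       # simulated E_e
Etau = np.mean(tau**2)   # simulated E_tau

print("E_e   =", Ee,   " vs (1 - alpha)*E_x       =", (1 - alpha) * Ex)          # Eq. (16.74)
print("E_tau =", Etau, " vs alpha*(1 - alpha)*E_x =", alpha * (1 - alpha) * Ex)  # Eq. (16.71)
print("E_tau =", Etau, " vs alpha*E_e             =", alpha * Ee)                # Eq. (16.72)
```

With a large number of trials the simulated values agree with the closed-form expressions to within Monte-Carlo error.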
16.A.1 Bias-removed estimates
From Eq. (16.67) we see that the expected value of the estimate $\hat{x}$, given the transmitted symbol $x$, is given by
$$
E[\hat{x} \mid x] = \alpha x + E[\tau \mid x]. \tag{16.76}
$$
Usually $\tau$ is a zero-mean random variable,
$$
E[\tau] = 0, \tag{16.77}
$$
and it is statistically independent of $x$. In particular, therefore, $E[\tau \mid x] = 0$, and we see that
$$
E[\hat{x} \mid x] = \alpha x \neq x. \tag{16.78}
$$
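As a concrete illustration (not taken from the text: the observation model $y = x + n$, the noise variance $\sigma_n^2$, and the particular value of $\alpha$ below are assumed purely for this example), suppose a zero-mean symbol $x$ with energy $\mathcal{E}_x$ is observed in zero-mean noise that is uncorrelated with it. The linear MMSE estimate is then
$$
\hat{x} \;=\; \frac{\mathcal{E}_x}{\mathcal{E}_x+\sigma_n^2}\,y
\;=\; \underbrace{\frac{\mathcal{E}_x}{\mathcal{E}_x+\sigma_n^2}}_{\alpha}\,x \;+\; \underbrace{\alpha\,n}_{\tau},
\qquad\text{so}\qquad
E[\hat{x}\mid x] = \alpha x \neq x \ \text{ whenever } \sigma_n^2 > 0.
$$
Here $0 < \alpha < 1$: the estimate shrinks the symbol toward zero, which is precisely the bias discussed next.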
Since the conditional expectation of $\hat{x}$ is different from $x$, we say that the estimate is biased. An obvious way to remove the bias is to just “kick it out.” That is, define the bias-removed estimate $\hat{x}_{br} = \hat{x}/\alpha$. This has the form
$$
\hat{x}_{br} = x + \frac{\tau}{\alpha}, \tag{16.79}
$$
where “br” is a reminder for “bias-removed.” Clearly the mean squared error of the bias-removed estimate is $\mathcal{E}_\tau/\alpha^2$, which simplifies to
$$
\mathcal{E}_{e,br} = \frac{\mathcal{E}_e}{\alpha} \tag{16.80}
$$
using Eq. (16.72). Since $\alpha < 1$ (unless the estimate is already unbiased), there is an increase in the MSE due to bias removal.
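A quick numerical confirmation of this penalty is sketched below; as before, the scalar model, the parameter values, and the linear MMSE coefficient are assumptions made only for this illustration, not something specified in the text.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
Ex, sigma2 = 1.0, 0.5                      # assumed symbol energy and noise variance

x = rng.normal(scale=np.sqrt(Ex), size=N)  # zero-mean symbol
y = x + rng.normal(scale=np.sqrt(sigma2), size=N)

alpha = Ex / (Ex + sigma2)   # linear MMSE coefficient for this assumed scalar model
x_hat = alpha * y            # biased estimate:        E[x_hat | x] = alpha*x
x_br = x_hat / alpha         # bias-removed estimate:  E[x_br  | x] = x

Ee = np.mean((x - x_hat)**2)
Ee_br = np.mean((x - x_br)**2)

print("E_e         =", Ee)
print("E_e,br      =", Ee_br)
print("E_e / alpha =", Ee / alpha)   # Eq. (16.80): bias removal inflates the MSE by 1/alpha
```

The simulated MSE of the bias-removed estimate matches $\mathcal{E}_e/\alpha$, confirming that unbiasedness is bought at the price of a larger mean squared error.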