Estimation by Gradient Descent
The problem of computing (5.62) can be reformulated as finding the minimum
of
\[
\sum_{n=1}^{N} m(x_n) \bigl( \tau^{-1} - (w_N^T x_n - y_n)^2 \bigr)^2 .
\tag{5.64}
\]
That the minimum of the above with respect to \(\tau^{-1}\) is indeed (5.62) is easily
shown by setting its gradient with respect to \(\tau^{-1}\) to zero and solving for \(\tau^{-1}\).
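This fixed-point property is easy to check numerically. The sketch below uses synthetic matching weights and residuals (both invented for illustration, not values from the text) and compares the closed-form minimiser of (5.64) with a grid search over the objective:

```python
import numpy as np

# Synthetic stand-ins (illustrative assumptions, not from the text):
# m holds matching weights m(x_n), e holds residuals w_N^T x_n - y_n.
rng = np.random.default_rng(0)
m = rng.uniform(0.1, 1.0, size=50)
e = rng.normal(size=50)

# Setting the gradient of sum_n m_n (tau_inv - e_n^2)^2 to zero gives the
# match-weighted mean of squared errors -- the biased estimate (5.62).
tau_inv_closed = np.sum(m * e**2) / np.sum(m)

# A grid search over the objective (5.64) recovers the same minimiser.
grid = np.linspace(0.0, 5.0, 20001)
objective = (m[None, :] * (grid[:, None] - e**2) ** 2).sum(axis=1)
tau_inv_grid = grid[np.argmin(objective)]

assert abs(tau_inv_closed - tau_inv_grid) < 1e-3
```

The closed-form minimiser and the grid minimiser coincide up to the grid resolution, confirming that minimising (5.64) yields the match-weighted mean of squared errors.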
This minimisation problem can now be solved with any gradient-based me-
thod. Applying the LMS algorithm, the resulting update equation is given by
\[
\tau_{N+1}^{-1} = \tau_N^{-1} + \gamma m(x_{N+1}) \bigl( (w_{N+1}^T x_{N+1} - y_{N+1})^2 - \tau_N^{-1} \bigr) .
\tag{5.65}
\]
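As a concrete illustration, the update (5.65) can be run on a synthetic data stream. The generating model, noise level, matching function \(m(x) \equiv 1\), and step size \(\gamma\) below are all assumptions made for this sketch, and the weight vector estimate is held fixed at the true weights for simplicity:

```python
import numpy as np

# Sketch of the LMS-style precision update (5.65) on a synthetic stream.
# Model, noise level, matching function and step size are illustrative
# assumptions; the weight vector estimate is held fixed here.
rng = np.random.default_rng(1)
w = np.array([0.5, -0.2])   # weight vector estimate
gamma = 0.01                # LMS step size
tau_inv = 0.0               # running estimate of the noise variance tau^{-1}

for _ in range(5000):
    x = rng.normal(size=2)
    y = w @ x + rng.normal(scale=0.3)   # observation noise with std 0.3
    m_x = 1.0                            # m(x) = 1: input is always matched
    tau_inv += gamma * m_x * ((w @ x - y) ** 2 - tau_inv)  # update (5.65)

# tau_inv now fluctuates around the true noise variance 0.3**2 = 0.09.
```

With a fixed step size the estimate tracks an exponentially weighted average of recent squared errors rather than converging to the sample average, which is one symptom of the gradient-based method's limitations.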
While this method provides a computationally cheap approach to estimating
the noise precision, it is flawed in several ways: firstly, it suffers under some
circumstances from slow convergence speed, just as any other gradient-based
method. Secondly, at each step the method relies on the updated weight vector
estimate, but does not take into account that changing the weight vector also
modifies past estimates and, with them, the squared estimation errors. Finally, by
minimising (5.64) we are computing the biased estimate (5.62) rather than the
unbiased estimate (5.63). The following method addresses all of these problems.
Estimation by Direct Tracking
Assume that the sequence of weight vector estimates \(\{w_1, w_2, \dots\}\) satisfies the
Principle of Orthogonality, which we can achieve by utilising the RLS algorithm.
In the following, a method for incrementally updating \(\|X_N w_N - y_N\|_{M_N}^2\) is
derived, which then allows for accurate tracking of the unbiased noise precision
estimate (5.63).
At first, let us derive a simplified expression for \(\|X_N w_N - y_N\|_{M_N}^2\): based on
the Corollary to the Principle of Orthogonality (5.17) and \(y_N = X_N w_N - (X_N w_N - y_N)\) we get
\[
\begin{aligned}
y_N^T M_N y_N &= w_N^T X_N^T M_N X_N w_N - 2 w_N^T X_N^T M_N (X_N w_N - y_N) \\
&\qquad + (X_N w_N - y_N)^T M_N (X_N w_N - y_N) \\
&= w_N^T X_N^T M_N X_N w_N + \|X_N w_N - y_N\|_{M_N}^2 ,
\end{aligned}
\tag{5.66}
\]
which, for the sum of squared errors, results in
\[
\|X_N w_N - y_N\|_{M_N}^2 = y_N^T M_N y_N - w_N^T X_N^T M_N X_N w_N .
\tag{5.67}
\]
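Identity (5.67) can be confirmed numerically: the match-weighted least-squares solution satisfies the Principle of Orthogonality, so both sides must agree. The data below are synthetic and purely for illustration:

```python
import numpy as np

# Numerical check of (5.67): for the match-weighted least-squares solution
# w_N, the match-weighted sum of squared errors equals
# y^T M y - w^T X^T M X w. All data are synthetic.
rng = np.random.default_rng(2)
N, d = 40, 3
X = rng.normal(size=(N, d))
y = rng.normal(size=N)
M = np.diag(rng.uniform(0.1, 1.0, size=N))   # diagonal matching matrix M_N

# w_N = (X^T M X)^{-1} X^T M y, the minimiser of ||Xw - y||^2_M, which
# satisfies the Principle of Orthogonality by construction.
w = np.linalg.solve(X.T @ M @ X, X.T @ M @ y)

lhs = (X @ w - y) @ M @ (X @ w - y)     # ||X_N w_N - y_N||^2_{M_N}
rhs = y @ M @ y - w @ X.T @ M @ X @ w   # right-hand side of (5.67)

assert np.isclose(lhs, rhs)
```

The cross term in (5.66) vanishes by orthogonality, which is exactly why the two computations coincide.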
Expressing \(\|X_{N+1} w_{N+1} - y_{N+1}\|_{M_{N+1}}^2\) in terms of \(\|X_N w_N - y_N\|_{M_N}^2\) requires
combining (5.31), (5.32) and (5.67), and the use of \(\Lambda_N w_N = X_N^T M_N y_N\) after
(5.30), which, after some algebra, results in the following: