4.5.2 Stability Analysis for a Large Class of Algorithms
In this section we will analyze the stability of a large family of adaptive algorithms
that encompasses several of the ones described in this chapter. The general form
assumed for the algorithms is:
\[
\mathbf{w}(n) = \alpha\,\mathbf{w}(n-1) + \boldsymbol{\mu}\,\mathbf{f}(\mathbf{x}(n))\,e(n),
\tag{4.52}
\]
where α is a positive number, typically 0 < α ≤ 1, μ = diag(μ_1, μ_2, ..., μ_L) with μ_i > 0, and f : R^L → R^L is an arbitrary multidimensional function. The recursion (4.52) is sufficiently general to encompass several basic algorithms. For example, when α = 1, μ = μ I_L and f(x(n)) = x(n) we obtain the LMS algorithm. With the same choices for α and μ, and f(x(n)) = sign[x(n)], we obtain the SDA, and with f(x(n)) = x(n)/‖x(n)‖^2 we get the NLMS algorithm. With (4.52) we can also obtain other
interesting variants, such as the multiple step size LMS and the Leaky LMS [19, 34].
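To make these specializations concrete, the following is a minimal sketch (in Python/NumPy, not from the book) of one iteration of (4.52). It assumes that e(n) is the a priori error d(n) − w^T(n−1)x(n) (the book obtains it from (4.48), not reproduced here), and the names general_update, f_lms, f_sda and f_nlms are purely illustrative:

```python
import numpy as np

def general_update(w, x, d, alpha, mu, f):
    """One iteration of the general recursion (4.52):
    w(n) = alpha * w(n-1) + mu * f(x(n)) * e(n)."""
    e = d - w @ x                      # a priori error (assumed definition)
    return alpha * w + mu * f(x) * e

# Choices of f that recover the algorithms named above
# (a scalar mu plays the role of mu * I_L):
f_lms  = lambda x: x                        # LMS
f_sda  = lambda x: np.sign(x)               # sign data algorithm (SDA)
f_nlms = lambda x: x / (x @ x + 1e-12)      # NLMS; the small constant is an added safeguard
```

Taking α slightly smaller than 1 together with f_lms would correspond to the Leaky LMS variant mentioned above.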
We will obtain stability results for this general model and then specialize the results for some of the algorithms presented in this chapter. It is clear from (4.52) that the sign error algorithm discussed in Sect. 4.4.1 cannot be analyzed within this framework. This is because that algorithm presents a nonlinearity in the error, whereas the model (4.52) covers only algorithms that are linear in the error. For an excellent and rigorous analysis of the sign error algorithm see [12].
The stability analysis presented here will be limited to two important measures:
mean stability and mean square deviation (MSD) stability. Using the misalignment
vector defined in (2.23), we will say that an algorithm is mean stable if
\[
\lim_{n\to\infty} E\bigl[\tilde{\mathbf{w}}(n)\bigr] < \infty.
\tag{4.53}
\]
That is, all the components of lim_{n→∞} E[w̃(n)] are finite. In an analogous manner, we will say that an algorithm is MSD stable when
\[
\lim_{n\to\infty} E\bigl[\|\tilde{\mathbf{w}}(n)\|^{2}\bigr] < \infty.
\tag{4.54}
\]
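As a purely illustrative aside, both measures can be estimated empirically by Monte Carlo averaging over independent runs. The sketch below (in Python/NumPy, with arbitrary step size, noise level and filter length, and assuming the misalignment of (2.23) is w_T − w(n)) does this for the LMS instance of (4.52):

```python
import numpy as np

rng = np.random.default_rng(0)
L, n_iter, n_runs = 4, 2000, 200
alpha, mu = 1.0, 0.05                       # LMS choices in (4.52)
w_true = rng.standard_normal(L)             # unknown system to identify (illustrative)

w_tilde = np.zeros((n_runs, L))             # misalignment at the final iteration, per run
for r in range(n_runs):
    w = np.zeros(L)
    for n in range(n_iter):
        x = rng.standard_normal(L)                      # input vector
        d = w_true @ x + 0.01 * rng.standard_normal()   # desired signal plus noise
        e = d - w @ x                                    # a priori error
        w = alpha * w + mu * x * e                       # f(x(n)) = x(n), i.e. LMS
    w_tilde[r] = w_true - w

print("estimate of E[w~(n)]   :", w_tilde.mean(axis=0))             # cf. (4.53)
print("estimate of E[||w~||^2]:", np.mean(np.sum(w_tilde**2, 1)))   # cf. (4.54)
```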
The following are some important remarks. It is clear that (4.53) includes the important case in which w(n) is an asymptotically unbiased estimator. Although the unbiasedness property is a very useful one, in general one should not think that it is the only important property to look at in an adaptive algorithm. It is clear that this property will be useless if the asymptotic dispersion of w(n) around its mean value is very large or infinite. Although a proper description of that dispersion is given by the correlation matrix E[w̃(n)w̃^T(n)], the quantity E[‖w̃(n)‖^2] will be useful too, and easier to handle. Thus, the condition (4.54) permits a more insightful view of the asymptotic stability performance of an adaptive algorithm. In fact, it can be easily shown that MSD stability implies mean stability.
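A one-line justification of that last claim (not spelled out here in the text) follows from Jensen's inequality applied componentwise:
\[
\bigl(E[\tilde{w}_i(n)]\bigr)^{2} \;\le\; E\bigl[\tilde{w}_i^{2}(n)\bigr] \;\le\; E\bigl[\|\tilde{\mathbf{w}}(n)\|^{2}\bigr], \qquad i = 1, \dots, L,
\]
so if the limit in (4.54) is finite, every component of E[w̃(n)] remains bounded asymptotically, which gives (4.53).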
We will begin our stability analysis by substituting (4.48) into (4.52), leading to