highly correlated signals, various families of adaptive algorithms have been
proposed in the literature. 18 RLS algorithms, for example, can offer faster
convergence rates, but in general require computational resources that are too
large for a real-time implementation. Their fast versions, i.e., the fast
recursive least squares (FRLS) algorithms, still suffer from numerical problems
when implemented with finite-word-length arithmetic. Another class of
algorithms able to offer better convergence and tracking rates than LMS
algorithms, the so-called AP algorithms, was proposed by Ozeki and Umeda about
two decades ago. This class has been reconsidered quite recently, and fast
versions of AP algorithms, i.e., fast affine projection (FAP) algorithms, have
been developed. 16, 26, 41 These algorithms retain the good performance of AP
algorithms but with a computational load close to that of LMS algorithms. The
significant aspect of this class of algorithms is that, in the presence of
correlated signals, it offers convergence rates higher than those of LMS
algorithms and tracking capabilities better than those of RLS algorithms. 17
For this reason, they are particularly suitable for AEC implementation; fast
convergence and good tracking are important features for AECs because of the
highly time-varying nature of the echo path.
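As an illustration of the basic mechanism, the following sketch (in Python with NumPy; the step size mu, regularization delta, and projection order P are illustrative choices, not values from the text) performs one affine projection update, in which the coefficient correction is projected onto the span of the P most recent input vectors.

```python
import numpy as np

def ap_update(w, X, d, mu=0.5, delta=1e-6):
    """One affine projection (AP) coefficient update (illustrative sketch).

    w : (L,)   current filter coefficients
    X : (L, P) columns are the P most recent input vectors
    d : (P,)   corresponding reference (desired) samples
    """
    e = d - X.T @ w  # a-priori error vector over the last P samples
    # Correction projected onto the span of the last P input vectors
    correction = X @ np.linalg.solve(X.T @ X + delta * np.eye(X.shape[1]), e)
    return w + mu * correction
```

For P = 1 this update reduces to the NLMS recursion, which is why AP algorithms can be seen as bridging the gap between NLMS and RLS behavior.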
An advantage of polynomial models is that all these adaptation techniques
can, in principle, be extended to Volterra filters, since their output is still
linear with respect to the filter coefficients. 24 The drawback is that the
computational requirements are often too high, even for a truncated
second-order Volterra filter. Even when the most economical LMS and NLMS
algorithms are adopted, the implementation complexity of a nonlinear AEC is not
suitable for acoustic echo cancellation in real time. In addition, when these
algorithms are applied to Volterra filters, the convergence rate becomes too
low because of the ill-conditioned nature of the input autocorrelation
matrix. 22 The NLMS algorithm may improve this rate, but it is not the ultimate
solution.
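The linearity in the coefficients can be made explicit with a minimal sketch of a truncated second-order Volterra filter output (the memory length N and the variable names are hypothetical): the output is nonlinear in the input samples but is a linear combination of the kernel coefficients, so gradient-type updates apply directly; the number of quadratic coefficients, however, grows as N(N+1)/2, which is the source of the complexity problem noted above.

```python
import numpy as np

def volterra2_output(x_buf, h1, h2):
    """Output of a truncated second-order Volterra filter (sketch).

    x_buf : (N,)   most recent input samples, x_buf[0] = x(n)
    h1    : (N,)   linear kernel
    h2    : (N, N) quadratic kernel (upper-triangular part used)
    Nonlinear in x(n), but linear in h1 and h2, so LMS/NLMS/AP
    updates can be applied to the stacked coefficient vector.
    """
    y = h1 @ x_buf
    N = len(x_buf)
    for i in range(N):
        for j in range(i, N):  # N*(N+1)/2 quadratic terms
            y += h2[i, j] * x_buf[i] * x_buf[j]
    return y
```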
In this section, after a brief introduction to the classical LMS and NLMS
algorithms, we show how it is possible to extend the basic AP algorithm to
quadratic Volterra filters. To obtain an efficient implementation, the diagonal
realization structure employed by SVFs is exploited.
7.3.1 LMS and NLMS Algorithms
The objective of an adaptive filter is to process the input signal x(n) so that
its output signal y(n) is close to a reference signal d(n). In theory, this
objective is reached by iteratively minimizing the statistical expectation of
some convex cost function of the error signal

e(n) = d(n) − y(n)    (7.16)
using a steepest-descent technique. This technique consists of updating the
filter coefficients at time n + 1 using a small fraction of the partial
derivatives of the cost function taken with respect to the filter coefficients
computed at time n. In practice, to obtain a realizable algorithm, so-called
stochastic gradient approximations are used, replacing the statistical
expectation of the cost function with its instantaneous value.
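A minimal sketch of the resulting stochastic-gradient updates, assuming a mean-square error cost (the step sizes mu and the regularization constant delta are illustrative values, not taken from the text):

```python
import numpy as np

def lms_update(w, x_vec, d, mu=0.01):
    """LMS: steepest-descent step on the instantaneous squared error."""
    e = d - w @ x_vec              # error e(n) = d(n) - y(n), Eq. (7.16)
    return w + mu * e * x_vec      # coefficient update at time n + 1

def nlms_update(w, x_vec, d, mu=0.5, delta=1e-6):
    """NLMS: step size normalized by the input vector energy."""
    e = d - w @ x_vec
    return w + (mu / (delta + x_vec @ x_vec)) * e * x_vec
```

The normalization in the NLMS update makes the step size insensitive to the input power, which is why it tends to converge faster than plain LMS when the input level fluctuates.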