method. In fact, this technique is presented as a kind of first
step toward truly adaptive methods, which constitute our main
interest.
Section 3.4 deals with the adaptive case and presents one of the most relevant techniques in the field: the least mean square (LMS) algorithm. We present LMS in the canonical way, i.e., as a stochastic approximation of the steepest-descent method.
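To make the idea concrete, the following is a minimal sketch of an LMS update, in which the exact gradient of the mean square error is replaced by its instantaneous estimate. The function name, signal names, and the noiseless system-identification setup are illustrative, not taken from the text:

```python
import numpy as np

def lms(x, d, num_taps, mu):
    """Sketch of LMS: adapt FIR weights w by a stochastic-gradient step
    on the instantaneous squared error (the factor of 2 is absorbed into mu)."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        # Current input (regressor) vector, most recent sample first
        u = x[n - num_taps + 1:n + 1][::-1]
        y = w @ u            # filter output
        e[n] = d[n] - y      # instantaneous error w.r.t. the reference d[n]
        w = w + mu * e[n] * u  # stochastic approximation of the gradient step
    return w, e
```

In a noiseless identification scenario, the weights converge toward the coefficients of the unknown filter that generated the reference signal, which is the behavior the steepest-descent interpretation predicts.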
In Section 3.5, we introduce the method of least squares (LS), which is the application of the LS estimation criterion to a linear FIR filter. In contrast with Wiener theory, the LS solution does not involve statistical averages but depends on a set of available data. The optimal solution can also be obtained in a recursive way, which gives rise to the so-called recursive least squares (RLS) algorithm.
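The recursive form can be sketched as follows: instead of re-solving the normal equations at each step, the inverse of the (exponentially weighted) data correlation matrix is propagated via the matrix inversion lemma. All names and the forgetting-factor setup here are illustrative assumptions, not the book's own code:

```python
import numpy as np

def rls(x, d, num_taps, lam=0.99, delta=100.0):
    """Sketch of RLS: recursively solve the exponentially weighted
    least-squares problem for the FIR weights w."""
    w = np.zeros(num_taps)
    P = delta * np.eye(num_taps)  # estimate of the inverse correlation matrix
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # regressor vector
        k = P @ u / (lam + u @ P @ u)         # gain vector
        e = d[n] - w @ u                      # a priori error
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam    # matrix-inversion-lemma update
    return w
```

Compared with LMS, this sketch trades higher per-sample cost (order of num_taps squared) for much faster convergence on the same data.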
Although the main interest of this chapter is in linear FIR filters,
Section 3.6 discusses alternative approaches, for the sake of com-
pleteness. A more in-depth discussion on nonlinear structures will
be presented in Chapter 7.
In Section 3.7, we turn our attention to the problem of filtering when a set of constraints on the filter parameters replaces the reference signal in the optimization process. Our main motivation is to present the constrained filtering case as a bridge between linear filter theory and the unsupervised problem.
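As an illustration of how constraints can stand in for a reference signal, consider a linearly constrained minimum-variance formulation: minimize the output power w^T R w subject to C^T w = f, whose closed-form solution follows from Lagrange multipliers. This particular formulation and the names below are an assumption for illustration, not necessarily the one adopted in Section 3.7:

```python
import numpy as np

def lcmv(R, C, f):
    """Sketch of a linearly constrained minimum-variance solution:
    minimize w^T R w subject to C^T w = f, via Lagrange multipliers:
    w = R^{-1} C (C^T R^{-1} C)^{-1} f."""
    Rinv_C = np.linalg.solve(R, C)               # R^{-1} C
    w = Rinv_C @ np.linalg.solve(C.T @ Rinv_C, f)
    return w
```

Note that no desired signal appears anywhere: the constraint pair (C, f) plays the role that the reference signal plays in supervised filtering.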
Finally, in Section 3.8 , we revisit the special case of linear prediction ,
in order to discuss some particularly important properties. This dis-
cussion results in a connection between prediction and equalization
problems, which will be exploited in subsequent chapters.
Historical Notes
As is usual in textbooks, and for didactic reasons, this chapter starts with a discussion of Wiener theory, which is founded on the MMSE estimation criterion. However, this precedence is not historical: the LS method, whose development is attributed to Gauss, dates from 1795 [119], although Legendre was the first to publish it, in 1810 [178].
The development of estimation theory in a stochastic context derived from advances in statistical inference and probability theory. The application of the MMSE criterion to the linear prediction problem gave rise to modern filtering theory, thanks to the work of Kolmogorov [170], in 1939, and of the American mathematician Norbert Wiener during the 1940s, the latter's work finally being published in 1949 [305]. Kolmogorov oriented his work toward discrete-time stationary processes, and his results were complemented by those of Mark Krein, an eminent pupil of his. Wiener formulated
 