Historical Notes
The mathematical foundations of the theory of signals and systems were
established by eminent mathematicians of the seventeenth and eighteenth
centuries. This coincides, in a way, with the advent of calculus,
since the representation of physical phenomena in terms of functions of
continuous variables and differential equations gave rise to an appropriate
description of the behavior of continuous signals and systems. Furthermore,
as noted by Alan Oppenheim and Ronald Schafer [219], the classical
works on numerical analysis developed by mathematicians such as Euler,
Bernoulli, and Lagrange sowed the seeds of discrete-time signal processing.
The bridge between continuous- and discrete-time signal processing was
theoretically established by the sampling theorem, introduced in the works
of Harry Nyquist in 1928 and Dennis Gabor in 1946, and definitively proved by
Claude Shannon in 1949. Notwithstanding this central result, signal process-
ing was typically carried out by analog systems and in a continuous-time
framework, basically due to performance limitations of the existing digital
machines. Simultaneously with the development of computers, a landmark
result appeared: the proposition of the fast Fourier transform algorithm by
Cooley and Tukey in 1965. Indeed, this result is considered one of
the most important in the history of discrete-time signal processing,
since it opened the way to the practical implementation of many other
algorithms in digital hardware.
Two other branches of mathematics are fundamental in the modern
theory of signals and systems: functional analysis and probability theory.
Functional analysis is concerned with the study of vector spaces and oper-
ators acting upon them, which are crucial for different methods of signal
analysis and representation. From it derives the concept of the Hilbert space,
so named by John von Neumann in 1929 in recognition of the work
of the great mathematician David Hilbert. This is
a fundamental concept to describe signals and systems in a transformed
domain, including the Fourier transform, a major tool in signal process-
ing, the principles of which had been introduced one century before by
Jean-Baptiste Joseph Fourier.
Probability theory allows the theory of signals and systems to be extended
to scenarios where randomness or uncertainty is present. The creation
of a mathematical theory of probability is attributed to two great French
mathematicians, Blaise Pascal and Pierre de Fermat, in 1654. Over the following
three centuries, important works were written by the likes of Jakob Bernoulli,
Abraham de Moivre, Thomas Bayes, Carl Friedrich Gauss, and many others.
In 1812, Pierre de Laplace introduced a host of new ideas and mathematical
techniques in his treatise Théorie Analytique des Probabilités [175].
Since Laplace, many authors have contributed to developing a mathe-
matical probability theory precise enough for use in mathematics as well
 