amplitude modulation (PAM) signals. The approach gave rise to a number of interesting algorithms, particularly throughout the 1980s, many of which were derived from an intuitive starting point.
A first theoretical landmark was the work of Benveniste et al. [41] in 1980,
which stated fundamental conditions for blind deconvolution. Moreover, the
authors proposed a class of unsupervised algorithms, which encompasses
the Sato algorithm, and studied their convergence properties. Complexity
was an issue, as the approach required the equalization of probability density functions (pdfs) or, equivalently, of the entire infinite set of higher-order statistics of the involved signals.
Also in 1980, Dominique Godard proposed a new class of cost functions
to be applied to complex signals, such as quadrature amplitude modula-
tion (QAM) signals [118]. Later, in 1983, Treichler and Agee exploited the
structural properties of the transmitted signal to design a cost function. In particular, the idea of restoring the constant modulus (CM) property of certain modulations was exploited in the constant modulus algorithm (CMA) [292], probably the most investigated unsupervised adaptive algorithm for blind equalization.
Interestingly, the CMA is identical to one of the members of the class of algo-
rithms proposed by Godard. That is why, in general, credit is given to both
works for the formulation of the approach.
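A stochastic-gradient version of the CM cost, J(w) = E[(|y|² − R₂)²], can be sketched as follows. This is a minimal illustration in Python, assuming a linear FIR equalizer and a complex baseband received sequence; the function name, initialization, and parameter values are illustrative choices, not taken from the references above:

```python
import numpy as np

def cma_equalize(x, num_taps=11, mu=1e-3, R2=1.0):
    """Stochastic-gradient CMA for a linear FIR equalizer (a sketch).

    Minimizes J = E[(|y|^2 - R2)^2] one sample at a time, where
    R2 = E[|s|^4] / E[|s|^2] is the dispersion constant of the source.
    """
    w = np.zeros(num_taps, dtype=complex)
    w[num_taps // 2] = 1.0                  # center-spike initialization
    y = np.zeros(len(x) - num_taps + 1, dtype=complex)
    for n in range(len(y)):
        u = x[n:n + num_taps][::-1]         # equalizer regressor
        y[n] = w @ u
        e = y[n] * (np.abs(y[n])**2 - R2)   # stochastic gradient of the CM cost
        w -= mu * e * np.conj(u)            # blind tap update: no training data
    return w, y
```

For a unit-power QPSK source, R₂ = 1 and the cost penalizes any deviation of |yₙ| from the unit circle, so neither a training sequence nor a decision device is required.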
The CM, Sato, and other algorithms were shown to belong to the
class of the so-called Bussgang algorithms, introduced by Godfrey and
Rocca [125] and Bellini and Rocca [36]. In [35], Bellini provides an inter-
esting survey of Bussgang methods. The term “blind equalization” seems
to have been first introduced by Benveniste and Goursat in a paper that
appeared in 1984 [40], in which the authors proposed an update procedure that combines the decision-directed (DD) and Sato algorithms. In 1987, Picchi and Prati proposed the “stop-and-go” algorithm [239],
which also combines the Sato and DD strategies, allowing the adaptation process to continue or stop depending on a reliability criterion.
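The reliability test can be sketched as a per-component sign agreement between the DD error and the Sato error: the taps move only when the two errors point the same way. This is a simplified sketch in Python, not the exact recursion of [239]; the QPSK slicer, the dispersion constant gamma, and the function name are assumptions made for illustration:

```python
import numpy as np

def stop_and_go_step(w, u, gamma, mu):
    """One tap update of a stop-and-go style equalizer (a sketch).

    Adaptation is enabled, separately for the real and imaginary parts,
    only when the decision-directed (DD) error and the Sato error agree
    in sign; otherwise that component of the update is frozen.
    """
    y = w @ u
    # QPSK slicer (assumed constellation; any decision device could be used)
    d = (np.sign(y.real) + 1j * np.sign(y.imag)) / np.sqrt(2)
    e_dd = y - d                                          # DD error
    e_sato = y - gamma * (np.sign(y.real) + 1j * np.sign(y.imag))  # Sato error
    f_re = float(np.sign(e_dd.real) == np.sign(e_sato.real))  # "go" flags
    f_im = float(np.sign(e_dd.imag) == np.sign(e_sato.imag))
    e = f_re * e_dd.real + 1j * f_im * e_dd.imag
    return w - mu * e * np.conj(u), y
```

When the output already sits on a constellation point, the DD error vanishes and the flags freeze the taps; far from a decision point, both errors typically agree and adaptation proceeds.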
A second theoretical landmark occurred in 1990 when Shalvi and
Weinstein [269] significantly simplified the conditions for blind deconvolu-
tion as previously stated by Benveniste et al. Before their work, the general belief was that an infinite number of statistics had to be matched to guarantee zero-forcing (ZF) equalization. Shalvi and Weinstein showed that ZF equalization can be achieved if only two statistics of the involved signals are equalized. In fact, they proved
that, if the magnitude of the fourth-order cumulant (kurtosis) is maximized while the second-order cumulant is kept equal to that of the source, then the recovered signal is a scaled and rotated version of the transmitted signal. Later, they also proved
that other higher-order statistics could be used to ensure perfect equalization [271]. This result was important in providing theoretical support for blind equalization criteria and low-complexity algorithms. In 1993, the same authors proposed a cumulant-based method known as the super-exponential algorithm [270].
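The Shalvi–Weinstein result can be checked numerically: for a combined channel-plus-equalizer response h with Σ|hₖ|² = 1, the output kurtosis satisfies |K(y)| = |K(s)| Σ|hₖ|⁴, which is maximal exactly when h has a single nonzero tap, i.e., under ZF equalization. The following small numerical sketch in Python illustrates this; the signal model, helper name, and chosen response h are illustrative assumptions:

```python
import numpy as np

def kurtosis(y):
    """Fourth-order cumulant K(y) = E|y|^4 - 2(E|y|^2)^2 - |E[y^2]|^2."""
    return (np.mean(np.abs(y)**4)
            - 2.0 * np.mean(np.abs(y)**2)**2
            - np.abs(np.mean(y**2))**2)

rng = np.random.default_rng(1)
n = 200_000
# unit-power QPSK source: K(s) = -1
s = (rng.choice([1.0, -1.0], n) + 1j * rng.choice([1.0, -1.0], n)) / np.sqrt(2)

h = np.array([0.8, 0.5, 0.33])
h = h / np.linalg.norm(h)      # unit-norm combined response: E|y|^2 = E|s|^2
y = np.convolve(s, h)[:n]      # output of the dispersive combined system

# |K(y)| shrinks by the factor sum |h_k|^4 < 1, so a single-spike response
# (perfect ZF equalization) is the unique maximizer of |K(y)| at fixed power
print(abs(kurtosis(s)), abs(kurtosis(y)), np.sum(np.abs(h)**4))
```

Because the second-order statistics are pinned by the unit-norm constraint, maximizing |K(y)| drives h toward a single spike, which is precisely the Shalvi–Weinstein characterization of ZF equalization.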