so-obtained surrogate ML fitting criterion are not exact ML estimates, yet usually
they have good performance and generally they are by design much simpler to com-
pute than the exact ML estimates. For example, even if the data are not Gaussian
distributed, an ML fitting criterion derived under the Gaussian hypothesis will often
lead to computationally convenient and yet accurate estimates. Another example
here is sinusoidal parameter estimation from data corrupted by colored noise: the
“ML” fitting criterion derived under the assumption that the noise is white leads
to parameter estimates of the sinusoidal components whose accuracy asymptoti-
cally achieves the exact Cramér-Rao bound (derived under the correct assumption
of colored noise); see [43, 44]. The APES method [13, 15] is another example
where a surrogate “ML” fitting criterion, derived under the assumption that the
data snapshots are Gaussian and independent, leads to estimates with excellent
performance. We follow the same approach in the following chapters by extending
the APES method to the missing-data case.
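As a concrete illustration of the sinusoid-in-colored-noise example above, the sketch below fits a single complex sinusoid by the surrogate white-noise "ML" (nonlinear least-squares) criterion, which for one sinusoid reduces to locating the peak of the periodogram; the AR(1) noise model, grid size, and all parameter values are illustrative assumptions, not taken from the text.

```python
# Minimal sketch (assumed scenario, not from the text): estimate the frequency
# and amplitude of one complex sinusoid observed in colored (AR(1)) noise,
# using the nonlinear least-squares criterion derived as if the noise were white.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate one complex sinusoid in AR(1) colored noise ---
N = 256
t = np.arange(N)
omega_true, amp_true = 0.9, 1.0                 # true frequency (rad/sample) and amplitude
signal = amp_true * np.exp(1j * omega_true * t)

e = rng.standard_normal(N) + 1j * rng.standard_normal(N)
noise = np.empty(N, dtype=complex)
noise[0] = e[0]
for n in range(1, N):
    noise[n] = 0.8 * noise[n - 1] + e[n]        # AR(1) coloring
y = signal + noise

# --- surrogate "white-noise ML" criterion: nonlinear least squares ---
# For each candidate frequency w, the LS amplitude is a(w)^H y / N, and the
# criterion reduces to maximizing the periodogram |a(w)^H y|^2 / N over w.
omegas = np.linspace(0, 2 * np.pi, 4096, endpoint=False)
steering = np.exp(1j * np.outer(t, omegas))     # N x K matrix of candidate sinusoids
periodogram = np.abs(steering.conj().T @ y) ** 2 / N

omega_hat = omegas[np.argmax(periodogram)]
amp_hat = np.exp(-1j * omega_hat * t) @ y / N   # LS amplitude at the estimated frequency

print(f"estimated frequency: {omega_hat:.4f} (true {omega_true})")
print(f"estimated amplitude: {abs(amp_hat):.4f} (true {amp_true})")
```

Even though the least-squares criterion ignores the noise correlation, the resulting frequency estimate is the kind of surrogate-ML estimate whose asymptotic accuracy the text attributes to [43, 44].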