Finding a solution to equation (6.18) amounts to solving a convex optimization
problem, which can be cast as a linear programming (LP) problem and solved using
interior-point methods. However, the computational complexity of the LP solver
increases dramatically with the size of the problem. A much more effective alterna-
tive is the versatile splitting framework that will be developed in detail in Chapter 7.
Here we discuss yet another approach that has been proposed by Starck et al. (2001)
based on an adaptation of the hybrid steepest descent (HSD) algorithm (Yamada
2001) to nonsmooth functionals. HSD allows the minimization of smooth convex functionals over the intersection of fixed-point sets of nonexpansive mappings. It is much faster than LP, and for problems of the form (6.18) the nonexpansive mappings involved have closed forms (Zhang et al. 2008b).
The denoising method described here can be seen as a particular instance of the
combined filtering method detailed in Section 8.3.
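To make the HSD recipe concrete, the following Python fragment sketches its generic iteration on a toy problem: minimizing a smooth quadratic over the intersection of two convex sets, each handled through a nonexpansive projection. The sets, step-size rule, and all function names are illustrative assumptions, not the construction used by Starck et al. (2001).

```python
import numpy as np

# Minimal sketch of a hybrid steepest descent (HSD) iteration on a toy
# problem: minimize the smooth functional f(x) = 0.5 * ||x - y||^2 over
# the intersection of a box and an l2 ball. Both projections are
# nonexpansive mappings; their composition T has the intersection as its
# fixed-point set. All names and parameters are illustrative.

def proj_box(x, lo=-1.0, hi=1.0):
    """Projection onto a box: a prototypical nonexpansive mapping."""
    return np.clip(x, lo, hi)

def proj_ball(x, radius=2.0):
    """Projection onto an l2 ball, also nonexpansive."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def hsd(y, n_iter=500):
    x = np.zeros_like(y)
    for k in range(1, n_iter + 1):
        lam = 1.0 / k                  # diminishing steps, as HSD requires
        t = proj_ball(proj_box(x))     # T(x): composition of projections
        x = t - lam * (t - y)          # gradient of f at T(x) is T(x) - y
    return x

x_hat = hsd(np.array([3.0, -0.2, 0.5]))  # approaches the constrained minimizer
```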
6.2.3.2 Maximum A Posteriori Estimator
Let us now go back to solving equation (6.16) when $\Phi$ is redundant. For the sake of simplicity, we assume a global parameter $t$, but it is not difficult to generalize the discussion to a subband-dependent set of parameters $t_{j,\ell}$. This MAP minimization problem has no closed-form solution, and iterative algorithms must be used.
For example, if the dictionary $\Phi = [\Phi_1 \ \Phi_2 \ \cdots \ \Phi_L]$ is built as a union of $L$ orthonormal transforms, the block-coordinate relaxation (BCR) algorithm was proposed by Sardy et al. (2000) as an effective numerical solver. This algorithm breaks the coefficient vector $\alpha = [\alpha_1 \ \alpha_2 \ \cdots \ \alpha_L]^{\mathrm{T}}$ into $L$ parts, each referring to an orthonormal transform in $\Phi$. The BCR algorithm addresses one set of representation coefficients at a time (say, $\alpha_l$), assuming all the others are fixed, and applies soft thresholding with threshold $t$ to the marginal residuals $Y - \sum_{l' \neq l} \Phi_{l'} \alpha_{l'}$. This process is repeated by iteratively cycling through the set of coefficients.
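As an illustration of the BCR sweep just described, here is a minimal Python sketch. It assumes each orthonormal transform is given explicitly as an orthogonal matrix Phi[l] (in practice one would use fast analysis and synthesis operators); the function names, sweep count, and toy data are illustrative.

```python
import numpy as np

# Minimal sketch of block-coordinate relaxation (BCR) for a dictionary
# built as a union of L orthonormal bases, each represented here by an
# explicit orthogonal matrix Phi[l]. In practice these would be fast
# transforms (e.g., DCT, orthogonal wavelets).

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bcr(y, Phi, t, n_sweeps=20):
    L = len(Phi)
    alpha = [np.zeros(y.shape[0]) for _ in range(L)]
    for _ in range(n_sweeps):
        for l in range(L):
            # Marginal residual: remove the contribution of all other blocks.
            r = y - sum(Phi[m] @ alpha[m] for m in range(L) if m != l)
            # Orthonormality of Phi[l] reduces the subproblem to analysis
            # of the residual followed by soft thresholding.
            alpha[l] = soft_threshold(Phi[l].T @ r, t)
    return alpha

# Illustrative usage: union of the standard basis and a random orthogonal basis.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
alpha = bcr(rng.standard_normal(8), [np.eye(8), Q], t=0.1)
```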
For a general dictionary $\Phi$, the sparse recovery problem (6.16) can be seen as a linear inverse problem and is a special instance of the sparsity-penalized form (7.26) considered in Chapter 7. It can be solved effectively using iterative thresholding, as detailed in Algorithm 24 or 23. The reader can easily verify that equation (6.2) with soft thresholding can be viewed as the first iteration of Algorithm 24 (Elad 2006).¹
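The following sketch illustrates such an iterative soft-thresholding scheme for a generic dictionary Phi, given here as an explicit matrix; the step-size bound, threshold, and iteration count are illustrative choices rather than a verbatim transcription of Algorithm 24.

```python
import numpy as np

# Sketch of iterative soft thresholding for a general dictionary Phi.
# The step 1/L, with L bounding the squared spectral norm of Phi, keeps
# the iteration stable; all parameters here are illustrative.

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ist(y, Phi, t, n_iter=100):
    alpha = np.zeros(Phi.shape[1])
    L = np.linalg.norm(Phi, 2) ** 2    # squared spectral norm of Phi
    for _ in range(n_iter):
        residual = y - Phi @ alpha
        alpha = soft_threshold(alpha + (Phi.T @ residual) / L, t / L)
    return alpha
```

Note that, starting from a zero coefficient vector with Phi a Parseval tight frame (so that L = 1), the first pass returns soft_threshold(Phi.T @ y, t), the soft-thresholded analysis estimate, consistent with the claim above.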
6.3 BLOCK NONLINEAR DENOISING
Term-by-term thresholding achieves a degree of trade-off between the variance and bias contributions to the MSE risk. However, this trade-off is not optimal; it removes
too many terms from the observed coefficient expansion, with the consequence that
the estimator is too biased and has a suboptimal MSE convergence rate. One way to
increase estimation precision is by exploiting information about neighboring coeffi-
cients. In other words, the observed transform coefficients tend to form clusters that
could be thresholded in blocks (or groups), rather than individually. This would
allow threshold decisions to be made more accurately and permit convergence
rates to be improved. Such a procedure has been introduced by Hall et al. (1998,
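To illustrate the block idea, the following sketch applies a James-Stein-type shrinkage rule to non-overlapping blocks of coefficients: a whole block is kept, shrunk, or zeroed according to its average energy. The block length, threshold, and shrinkage rule are illustrative stand-ins for the estimators discussed in this section.

```python
import numpy as np

# Illustrative block thresholding: coefficients are grouped into
# non-overlapping blocks and each block is treated as a whole according
# to its average energy (a James-Stein-type rule).

def block_threshold(coeffs, block_len, t):
    out = coeffs.astype(float)
    for start in range(0, len(out), block_len):
        block = out[start:start + block_len]
        energy = np.mean(block ** 2)
        # Shrink the whole block by a common factor; low-energy blocks
        # (energy <= t^2) are zeroed entirely.
        gain = max(0.0, 1.0 - t**2 / energy) if energy > 0 else 0.0
        out[start:start + block_len] = gain * block
    return out
```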
¹ Rigorously speaking, $\Phi$ should be a tight frame for this statement to hold.