closed-form estimate of the morphological component $x_{i,k}$ is

$$x_{i,k} = \Delta_{\mathbf{\Phi}_k,\lambda'}\!\left(\frac{a_i^{\mathrm{T}} R_{i,k}}{\|a_i\|_2^2}\right), \qquad (9.22)$$

where $\lambda' = \lambda/\|a_i\|_2^2$ for soft thresholding and $\lambda' = \sqrt{2\lambda}/\|a_i\|_2$ for hard thresholding.
As described in Chapter 8, the operator $\Delta_{\mathbf{\Phi},\lambda}(x)$ consists of (1) computing the coefficients of $x$ in the dictionary $\mathbf{\Phi}$, (2) thresholding (soft or hard) the obtained coefficients with the threshold $\lambda$, and (3) reconstructing from the thresholded coefficients:

$$\Delta_{\mathbf{\Phi},\lambda}(x) = \mathbf{\Phi}\,\mathrm{Thresh}_\lambda\!\left(\mathbf{\Phi}^{\mathrm{T}} x\right). \qquad (9.23)$$
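By way of illustration, here is a minimal numpy sketch of this operator, assuming $\mathbf{\Phi}$ is orthonormal and stored as an explicit matrix (in practice it would be a fast implicit transform); the function names thresh and delta are illustrative choices, not the book's notation:

```python
import numpy as np

def thresh(alpha, lam, kind="soft"):
    """Entrywise soft or hard thresholding of the coefficient array alpha."""
    if kind == "soft":
        return np.sign(alpha) * np.maximum(np.abs(alpha) - lam, 0.0)
    return alpha * (np.abs(alpha) > lam)   # hard: keep entries above the threshold

def delta(Phi, x, lam, kind="soft"):
    """Delta_{Phi,lambda}(x) = Phi Thresh_lambda(Phi^T x) for an
    orthonormal dictionary Phi given as an explicit (N, N) matrix."""
    alpha = Phi.T @ x                      # (1) coefficients of x in Phi
    alpha = thresh(alpha, lam, kind)       # (2) threshold the coefficients
    return Phi @ alpha                     # (3) reconstruct
```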
Equation (9.22) is exact when $\mathbf{\Phi}_k$ is orthonormal, whether $\mathrm{Thresh}_\lambda$ is either a hard or a soft thresholding. When $\mathbf{\Phi}_k$ is redundant, equation (9.22) is only the first iteration of the forward-backward splitting recursion described in Chapter 7 (see equations (7.32) and (7.65)), which should be used when $\mathbf{\Phi}_k$ is overcomplete. However, in practice, equation (9.22) can still be used to save computation time.
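Under the same orthonormality assumption, the one-step update (9.22) might be sketched as follows, reusing the illustrative delta helper above; the function name and array shapes are hypothetical, and the threshold rescaling follows the statement after (9.22):

```python
def update_component(a_i, R_ik, Phi_k, lam, kind="soft"):
    """One-step closed-form update of x_{i,k} via equation (9.22).
    a_i : (Nc,) mixing column; R_ik : (Nc, N) residual for component (i, k)."""
    norm2 = float(a_i @ a_i)               # ||a_i||_2^2
    # lam' = lam/||a_i||^2 (soft) or sqrt(2 lam)/||a_i|| (hard)
    lam_p = lam / norm2 if kind == "soft" else np.sqrt(2.0 * lam) / np.sqrt(norm2)
    u = (a_i @ R_ik) / norm2               # a_i^T R_{i,k} / ||a_i||_2^2
    return delta(Phi_k, u, lam_p, kind)    # Delta_{Phi_k, lam'}(u)
```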
Now, considering $\{a_{i'}\}_{i' \neq i}$ and all morphological components as fixed, and recalling that $N_c \geq N_s$, updating the column $a_i$ is then just a least squares estimate:

$$a_i = \frac{1}{\|s_i\|_2^2}\left(Y - \sum_{i' \neq i} a_{i'} s_{i'}\right) s_i^{\mathrm{T}}, \qquad (9.24)$$
where $s_i = \sum_{k=1}^{K} x_{i,k}$. This estimate is then projected onto the unit sphere to meet the unit $\ell_2$ norm constraint in equation (9.19). The GMCA algorithm is summarized in Algorithm 34.
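A corresponding sketch of the column update (9.24) with the unit-sphere projection, assuming $A$ is stored as an $(N_c \times N_s)$ matrix and $S$ stacks the sources $s_i$ row-wise, could read (again with illustrative names):

```python
def update_column(Y, A, S, i):
    """Least squares update of column a_i, equation (9.24), followed by
    projection onto the unit l2 sphere (the constraint in (9.19)).
    Y : (Nc, N) data; A : (Nc, Ns) mixing matrix; S : (Ns, N) sources."""
    denom = float(S[i] @ S[i])                # ||s_i||_2^2
    if denom < 1e-12:                         # source still zero: keep a_i unchanged
        return A[:, i]
    D = Y - A @ S + np.outer(A[:, i], S[i])   # Y - sum_{i' != i} a_{i'} s_{i'}
    a = (D @ S[i]) / denom                    # least squares estimate of a_i
    return a / np.linalg.norm(a)              # unit-norm projection
```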
For $p = 1$ and a fixed threshold $\lambda$, Algorithm 34 can be shown to converge to a stationary point; see Tseng (2001) and Bobin et al. (2008a). This point is not guaranteed to be even a local minimum of the energy, and this is even less clear for $p = 0$. Thus, in the same vein as MCA, GMCA relies on a salient-to-fine strategy using a varying threshold to mitigate the problem of sensitivity to initialization. More precisely, GMCA first computes coarse versions of the morphological components for any fixed source $s_i$. These raw sources are estimated from their most significant coefficients in $\mathbf{\Phi}$.
Then the corresponding column $a_i$ is estimated from the most significant features of $s_i$. Each source and its corresponding column of $A$ are then alternately and progressively refined as the threshold decreases toward $\lambda_{\min}$. This particular iterative thresholding scheme provides robustness to noise and initialization by working first on the most significant features in the data and then progressively incorporating smaller details to finely tune the model parameters. GMCA can be used with either a linear or an exponential decrease of the threshold, as for MCA in Chapter 8.
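Putting the pieces together, a minimal sketch of the GMCA outer loop with a linearly decreasing threshold might look like the following, reusing the hypothetical helpers above; the random initialization of $A$ and the starting threshold lam_max are crude illustrative choices, not the book's prescriptions, and swapping the lam line for a geometric decay would give the exponential schedule:

```python
def gmca(Y, Phi, Ns, n_iter=100, lam_min=0.0, kind="soft"):
    """Sketch of the GMCA outer loop with a linearly decreasing threshold.
    Y : (Nc, N) data; Phi : list of K orthonormal (N, N) dictionaries."""
    Nc, N = Y.shape
    K = len(Phi)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((Nc, Ns))
    A /= np.linalg.norm(A, axis=0)             # unit-norm starting columns
    X = np.zeros((Ns, K, N))                   # morphological components x_{i,k}
    lam_max = np.abs(Y).max()                  # crude illustrative starting threshold
    for t in range(n_iter):
        # linear decrease toward lam_min (salient-to-fine strategy)
        lam = lam_max + (lam_min - lam_max) * t / max(n_iter - 1, 1)
        S = X.sum(axis=1)                      # s_i = sum_k x_{i,k}
        for i in range(Ns):
            for k in range(K):
                # residual excluding the contribution of component (i, k)
                R_ik = Y - A @ S + np.outer(A[:, i], X[i, k])
                X[i, k] = update_component(A[:, i], R_ik, Phi[k], lam, kind)
                S[i] = X[i].sum(axis=0)        # refresh s_i after the update
            A[:, i] = update_column(Y, A, S, i)
    return A, X.sum(axis=1)
```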
If $A$ were known and fixed, GMCA would be equivalent to performing an MCA sparse decomposition of $Y$ in the tensor product multichannel dictionary $A \otimes \mathbf{\Phi}$. But as GMCA also updates the mixing matrix at each iteration, it is able to learn the spectral part of the multichannel dictionary directly from the data.