Digital Signal Processing Reference
         ⎧ a + λ/2,  if a ≤ −λ/2,
T_λ(a) = ⎨ 0,        if |a| < λ/2,
         ⎩ a − λ/2,  if a ≥ λ/2.
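A minimal numerical sketch of this soft-thresholding rule (the function name and the NumPy dependency are illustrative, not from the text):

```python
import numpy as np

def soft_threshold(a, lam):
    """Apply T_lambda elementwise: shrink each entry toward zero by
    lam/2 and zero out entries whose magnitude is below lam/2."""
    a = np.asarray(a, dtype=float)
    return np.sign(a) * np.maximum(np.abs(a) - lam / 2.0, 0.0)

# With lam = 2 the threshold is 1: 3.0 -> 2.0, 0.4 -> 0.0, -2.5 -> -1.5
print(soft_threshold([3.0, 0.4, -2.5], lam=2.0))
```

The `np.sign(a) * np.maximum(...)` form reproduces all three branches of the definition in one vectorized expression.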
The iterates y_{n+1} converge to the solution α̂ of (2.9) if ‖A‖₂ < 1 [45]. Similar
results can also be obtained using the hard-thresholding instead of the soft-
thresholding in (2.14) [11].
Other methods for solving (2.13) have also been proposed. See for instance
GPSR [61], SPGL1 [8], Bregman iterations [159], split Bregman iterations [65],
SpaRSA [157], and references therein.
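Such a soft-thresholded Landweber scheme can be sketched as below; the update rule y_{n+1} = T_λ(y_n + Aᵀ(b − A y_n)), the symbol b for the data, and the iteration count are assumptions based on the standard iteration of [45], not taken verbatim from this excerpt:

```python
import numpy as np

def soft_threshold(a, lam):
    # Elementwise shrinkage by lam/2 (the rule T_lambda defined earlier).
    return np.sign(a) * np.maximum(np.abs(a) - lam / 2.0, 0.0)

def ista(A, b, lam, n_iter=200):
    """Soft-thresholded Landweber iteration (a sketch):
        y_{n+1} = T_lambda(y_n + A.T @ (b - A @ y_n)).
    Assumes the spectral norm of A is below 1, so the map is a contraction."""
    y = np.zeros(A.shape[1])
    for _ in range(n_iter):
        y = soft_threshold(y + A.T @ (b - A @ y), lam)
    return y
```

For a diagonal A with entries below 1 the limit can be checked by hand against the fixed-point equation y = T_λ(y + Aᵀ(b − A y)).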
2.3.2.2
Greedy Pursuits
Under certain conditions, greedy algorithms such as matching pursuit [88], orthogonal
matching pursuit [109], [138], gradient pursuits [13], regularized orthogonal match-
ing pursuit [94], and Stagewise Orthogonal Matching Pursuit [49] can also be used to
recover sparse (or in some cases compressible) α from (2.8). In particular, a greedy
algorithm known as CoSaMP is well supported by theoretical analysis and provides
the same guarantees as some of the optimization-based approaches [93].
Let T be a subset of {1, 2, ···, N} and define the restriction of the signal x to the
set T as

x|_T = ⎧ x_i,  if i ∈ T,
       ⎩ 0,    otherwise.

Let A_T be the column submatrix of A whose columns are listed in the set T,
and define the pseudoinverse of a tall, full-rank matrix C by the formula

C‡ = (C∗C)⁻¹C∗.

Using this notation, the pseudo-code for CoSaMP is given in Algorithm 1, which can be used to solve the under-determined system of linear equations (2.4).
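A quick numerical check of these two definitions; the concrete x, T, and C below are illustrative choices:

```python
import numpy as np

# Restriction x|_T: keep the entries indexed by T, zero out the rest.
x = np.array([4.0, -1.0, 3.0, 2.0])
T = [0, 2]
x_T = np.zeros_like(x)
x_T[T] = x[T]
print(x_T)  # the restriction of x to the set {0, 2}

# Pseudoinverse of a tall, full-rank C via the formula (C*C)^(-1) C*.
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
C_pinv = np.linalg.inv(C.T @ C) @ C.T
# For a tall full-rank matrix this agrees with the SVD-based np.linalg.pinv,
# and C_pinv @ C is the identity.
print(np.allclose(C_pinv, np.linalg.pinv(C)))
```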
Let supp(x) = { j : x_j ≠ 0 }.
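Algorithm 1 itself is not reproduced in this excerpt; the following is a minimal sketch of the CoSaMP loop as described in [93], written with the restriction and pseudoinverse notation above (the function name, iteration cap, and stopping tolerance are assumptions):

```python
import numpy as np

def cosamp(A, b, s, n_iter=20, tol=1e-10):
    """Sketch of CoSaMP [93] for recovering an s-sparse x with A @ x = b."""
    m, N = A.shape
    x = np.zeros(N)
    v = b.copy()                                  # current residual
    for _ in range(n_iter):
        y = A.T @ v                               # proxy for the residual signal
        omega = np.argsort(np.abs(y))[-2 * s:]    # indices of the 2s largest proxies
        T = np.union1d(omega, np.flatnonzero(x))  # merge with the current support
        z = np.zeros(N)
        z[T] = np.linalg.pinv(A[:, T]) @ b        # least squares on the columns A_T
        keep = np.argsort(np.abs(z))[-s:]         # prune to the best s-term approximation
        x = np.zeros(N)
        x[keep] = z[keep]
        v = b - A @ x                             # update the residual
        if np.linalg.norm(v) < tol:
            break
    return x
```

Each pass enlarges a candidate support from the residual proxy, solves a least-squares problem on those columns, and prunes back to s terms.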
2.3.2.3
Other Algorithms
Recently, there has been a great interest in using ℓ_p minimization with p < 1 for
compressive sensing [37]. It has been observed that the minimization of such a
nonconvex problem leads to recovery of signals that are much less sparse than
required by the traditional methods [37]. Other related algorithms, such as FOCUSS
and reweighted ℓ_p minimization with p < 1, have also been proposed in [68] and
[29], respectively.
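A FOCUSS-style reweighted iteration can be sketched as follows; the weighting W = diag(|x|^(1−p/2)), the initialization, and the parameter defaults are illustrative assumptions rather than the exact algorithms of [68] or [29]:

```python
import numpy as np

def focuss(A, b, p=0.5, n_iter=30, eps=1e-9):
    """Reweighted minimum-norm iteration (a sketch): each step solves
    x = W @ pinv(A @ W) @ b with W = diag(|x|^(1 - p/2)), which
    progressively concentrates energy on a few entries when p < 1."""
    x = np.linalg.pinv(A) @ b              # start from the minimum l2-norm solution
    for _ in range(n_iter):
        w = np.abs(x) ** (1.0 - p / 2.0) + eps
        # A * w scales the columns of A, i.e. it forms A @ diag(w).
        x = w * (np.linalg.pinv(A * w) @ b)
    return x
```

Every iterate solves A x = b exactly (the system stays consistent), while the reweighting shrinks the small entries toward zero.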