Algorithm 27 Douglas-Rachford Splitting to Solve $(P_2^\sigma)$ with Tight Frames

Initialization: Choose some $\alpha^{(0)} \in \mathbb{R}^T$, $\mu \in (0, 2)$, $\gamma > 0$, and $\sigma \geq 0$.
Main iteration:
for $t = 0$ to $N_{\mathrm{iter}} - 1$ do
1. Compute the residual: $r^{(t)} = y - F\alpha^{(t)}$.
2. Projection onto the $\ell_2$ ball of radius $\sigma$:
$$\zeta^{(t)} = r^{(t)} - \left(1 - \frac{\sigma}{\|r^{(t)}\|_2}\right)_+ r^{(t)}. \qquad (7.49)$$
3. Compute the reflection projector:
$$\alpha^{(t+1/2)} = \alpha^{(t)} + 2c^{-1} F^*\left(r^{(t)} - \zeta^{(t)}\right). \qquad (7.50)$$
4. Soft thresholding:
$$\alpha^{(t+1)} = \left(1 - \frac{\mu}{2}\right)\alpha^{(t)} + \frac{\mu}{2}\left(2\,\mathrm{SoftThresh}_\gamma\!\left(\alpha^{(t+1/2)}\right) - \alpha^{(t+1/2)}\right). \qquad (7.51)$$
Output: Solution of $(P_2^\sigma)$: $\alpha^{(N_{\mathrm{iter}} - 1/2)}$.
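A minimal NumPy sketch of the iteration above, assuming a real tight frame satisfying $F F^* = c\,\mathrm{I}$ (so that $F^*$ is `F.T`); the frame, parameter values, and iteration count are illustrative choices, not prescribed by the text:

```python
import numpy as np

def soft_thresh(x, gamma):
    """Component-wise soft thresholding SoftThresh_gamma(x)."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def dr_p2_sigma(F, y, c, sigma, mu=1.0, gamma=0.1, n_iter=200):
    """Douglas-Rachford iterations (7.49)-(7.51) for a tight frame F F^T = c I.
    Returns alpha^(N_iter - 1/2), the last reflection-projection point."""
    alpha = np.zeros(F.shape[1])
    alpha_half = alpha.copy()
    for _ in range(n_iter):
        r = y - F @ alpha                                    # step 1: residual
        nr = np.linalg.norm(r)
        shrink = max(1.0 - sigma / nr, 0.0) if nr > 0 else 0.0
        zeta = r - shrink * r                                # step 2, eq. (7.49)
        alpha_half = alpha + (2.0 / c) * (F.T @ (r - zeta))  # step 3, eq. (7.50)
        alpha = (1.0 - mu / 2.0) * alpha + (mu / 2.0) * (
            2.0 * soft_thresh(alpha_half, gamma) - alpha_half)  # step 4, eq. (7.51)
    return alpha_half
```

Note that $\zeta^{(t)}$ is the projection of the residual onto the $\sigma$-ball, so the correction $r^{(t)} - \zeta^{(t)} = (1 - \sigma/\|r^{(t)}\|_2)_+\, r^{(t)}$ vanishes as soon as the iterate becomes feasible.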
For white Gaussian noise $\varepsilon$ of variance $\sigma_\varepsilon^2$, the feasibility condition holds with probability higher than $1 - e^{-\nu^2\sqrt{m}/8}$ if
$$\sigma^2 = \mathbb{E}\|\varepsilon\|_2^2 + \nu\sqrt{\mathrm{Var}\,\|\varepsilon\|_2^2}. \qquad (7.52)$$
Since $\mathbb{E}\|\varepsilon\|_2^2 = m\sigma_\varepsilon^2$ and $\mathrm{Var}\,\|\varepsilon\|_2^2 = 2m\sigma_\varepsilon^4$, taking $\nu = 2$, it follows easily that $\sigma = \sqrt{m}\,\sigma_\varepsilon\left(1 + 2\sqrt{2/m}\right)^{1/2}$ ensures feasibility with probability greater than $1 - e^{-\sqrt{m}/2}$. Similar arguments can be developed for other $(P_q^\sigma)$ constraints (Jacques et al. 2009).
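As a quick Monte Carlo sanity check of this choice of $\sigma$ (a sketch; the values of $m$ and $\sigma_\varepsilon$ are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma_eps, nu = 1024, 1.5, 2.0
# For white Gaussian noise: E||eps||_2^2 = m sigma_eps^2 and
# Var||eps||_2^2 = 2 m sigma_eps^4, so eq. (7.52) with nu = 2 gives:
sigma = np.sqrt(m * sigma_eps**2 + nu * np.sqrt(2.0 * m) * sigma_eps**2)
eps = rng.normal(0.0, sigma_eps, size=(10_000, m))
# fraction of noise draws inside the sigma-ball (close to 1)
feasible_freq = np.mean(np.linalg.norm(eps, axis=1) <= sigma)
print(feasible_freq)
```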
7.4.3 The Dantzig Selector
Let us now turn to the problem
$$(P_{DS}): \quad \min_{\alpha \in \mathbb{R}^T} \|\alpha\| \quad \text{s.t.} \quad \|F^*(y - F\alpha)\|_\infty \leq \tau. \qquad (7.53)$$
To avoid the unique trivial solution $\alpha = 0$, we assume that $\tau < \|F^* y\|_\infty$. The compact convex constraint set in equation (7.53) is denoted $\mathcal{C}_{DS}$. The Dantzig selector (Candès and Tao 2007) is when the norm in (7.53) is the $\ell_1$ norm, in which case $(P_{DS})$ can be recast as a linear program. The solution of this problem has many appealing properties from a statistical estimation standpoint and has been studied by Candès and Tao (2007), Bickel et al. (2009), and others.
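To illustrate the linear programming recast in the $\ell_1$ case (a sketch assuming a real dictionary; the split into a free variable $\alpha$ and an auxiliary bound $u \geq |\alpha|$ is the standard device, and `scipy.optimize.linprog` is used purely for convenience):

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector_lp(F, y, tau):
    """Solve min ||alpha||_1 s.t. ||F^T (y - F alpha)||_inf <= tau as an LP.
    Variables x = [alpha; u] with |alpha_i| <= u_i; minimize sum(u)."""
    m, T = F.shape
    G, z = F.T @ F, F.T @ y                    # Gram operator and z = F^T y
    I = np.eye(T)
    # |alpha| <= u       <=>  alpha - u <= 0  and  -alpha - u <= 0
    # |G alpha - z| <= tau  <=>  G alpha <= z + tau  and  -G alpha <= tau - z
    A_ub = np.block([[I, -I],
                     [-I, -I],
                     [G, np.zeros((T, T))],
                     [-G, np.zeros((T, T))]])
    b_ub = np.concatenate([np.zeros(T), np.zeros(T), z + tau, tau - z])
    c = np.concatenate([np.zeros(T), np.ones(T)])   # objective: sum of u
    bounds = [(None, None)] * T + [(0, None)] * T   # alpha free, u >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:T]
```

With $\tau \geq \|F^T y\|_\infty$ the zero vector is feasible and the LP returns $\alpha = 0$, which is why a smaller $\tau$ is required for a nontrivial solution.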
7.4.3.1 Algorithm and Convergence
Let $G = F^* F$ be the (self-adjoint) Gram operator and $z = F^* y$. Again, by straightforward application of the results of Section 7.3.2.3, $(P_{DS})$ can be solved using the