and
\[
\tilde{x}_{1,i} =
\begin{cases}
0, & \text{if } |z_i| \le \lambda_1/2; \\
\mathrm{sign}(z_i)\,\bigl(|z_i| - \lambda_1/2\bigr), & \text{if } |z_i| > \lambda_1/2.
\end{cases}
\]
Here, $\tilde{x}_{0,i}$ and $\tilde{x}_{1,i}$ denote the $i$th entries of $\tilde{x}_0$ and $\tilde{x}_1$, respectively, and $z_i$ is the $i$th entry of $z = T^\top y$.
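As a quick illustration (not part of the original text), the soft-thresholding rule above can be sketched in a few lines of NumPy; the function name and the test vector are ours:

```python
import numpy as np

def soft_threshold(z, lam1):
    """Entrywise soft thresholding: the (P1) solution under an orthogonal design.

    x1[i] = 0                                 if |z[i]| <= lam1 / 2
    x1[i] = sign(z[i]) * (|z[i]| - lam1 / 2)  if |z[i]| >  lam1 / 2
    """
    return np.sign(z) * np.maximum(np.abs(z) - lam1 / 2.0, 0.0)

z = np.array([0.1, -0.6, 1.5, -2.0])
print(soft_threshold(z, 1.0))  # entries with |z_i| <= 0.5 are set to zero
```

Note that each surviving entry is shrunk toward zero by exactly $\lambda_1/2$, which is what drives Corollary 7 below.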
For readers who are familiar with soft-thresholding and hard-thresholding$^{12}$, the above is not a surprise. The proof is omitted.
From the above, verifying the following becomes an easy task. Let
supp(x) denote the set of indices of the nonzero entries in vector x.
Corollary 7: When $\sqrt{\lambda_0} = \lambda_1/2$, one has $\mathrm{supp}(\tilde{x}_0) = \mathrm{supp}(\tilde{x}_1)$, i.e., there is a concurrent optimal subset. Moreover,
\[
\tilde{x}_{1,i} =
\begin{cases}
0, & \text{if } i \notin \mathrm{supp}(\tilde{x}_0); \\
\tilde{x}_{0,i} - \dfrac{\lambda_1}{2}\,\mathrm{sign}(z_i), & \text{if } i \in \mathrm{supp}(\tilde{x}_0).
\end{cases}
\]
The proof is obvious and is omitted.
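A small numerical check of this corollary can be written as follows; it assumes the standard hard-thresholding rule at $\sqrt{\lambda_0}$ for (P0) under an orthogonal design, and the numbers are illustrative:

```python
import numpy as np

def hard_threshold(z, lam0):
    # (P0) solution under an orthogonal design: keep z[i] when |z[i]| > sqrt(lam0)
    return np.where(np.abs(z) > np.sqrt(lam0), z, 0.0)

def soft_threshold(z, lam1):
    # (P1) solution under an orthogonal design
    return np.sign(z) * np.maximum(np.abs(z) - lam1 / 2.0, 0.0)

z = np.array([0.3, -0.8, 1.2, -0.49, 2.5])
lam0 = 0.25                 # hard threshold at sqrt(lam0) = 0.5
lam1 = 2.0 * np.sqrt(lam0)  # chosen so that lam1 / 2 = sqrt(lam0)

x0 = hard_threshold(z, lam0)
x1 = soft_threshold(z, lam1)

# Same support (the concurrent optimal subset) ...
assert np.array_equal(x0 != 0, x1 != 0)
# ... and on the support, x1 = x0 - (lam1 / 2) * sign(z)
mask = x0 != 0
assert np.allclose(x1[mask], x0[mask] - (lam1 / 2) * np.sign(z[mask]))
```

The check passes for any $z$ once the two thresholds are matched as in the corollary.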
Now there are two opposing examples. On one hand, if $T$ is orthogonal, both LARS and Lasso discover the optimal subset in (P0). On the other hand, we found an example in which a version of LARS would choose all the covariates outside the optimal subset before choosing anything inside. These inconsistencies encourage us to analyze the solutions of (P0) and (P1), and the conditions for a subset to be the concurrent optimal subset. This is where more results are anticipated. Readers may find details in subsequent, more technical papers.
5. Other Topics
We must admit that this article presents a somewhat unique aspect of the model selection problem. In the following, we discuss other works and their possible relation to the theme of this article.
5.1. Computing Versus Statistical Properties
As mentioned earlier, the question that we addressed in this paper is quite different from many other statistical works. In the present paper, we identify easy-to-verify (polynomial-time) conditions for the type-I optimal subset. Our direct motivation is that a certain greedy algorithm can find a path of type-II optimal subsets. If one of these type-II optimal subsets is confirmed