computational cost, instead of deciding whether the j-th variable, X_j, is selected or not directly from the posterior probability in Eq. (12), we adopt the following method. If the current variable is not selected in the union of the support sets, i.e., δ_j = 0, we propose to activate this variable by setting δ_j = 1 and sampling the individual indicators γ_{j,m} and coefficients β_{j,m} from the corresponding conditional distributions via the component-wise Gibbs sampling approach of Chen et al. (2011) [2], i.e., Step 3 in Algorithm 2. We then decide whether to keep the sampled indicators and coefficients via the Metropolis-Hastings acceptance-rejection rule. Conversely, if the variable is selected in S, i.e., δ_j = 1, we propose to turn this indicator off by switching δ_j to 0 and setting all the corresponding indicators γ_{j,m} and coefficients β_{j,m} to zero, and we likewise decide whether to accept this proposal via the Metropolis-Hastings acceptance-rejection rule. The proposed method can therefore be regarded as a sampling version of the two-layer Gibbs sampler. The details of these stages are given in the following.
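As a rough illustration, the sketch below lays out the control flow of this add/delete move for a single variable j. The callables propose_activation (standing in for the conditional draws of Step 3 in Algorithm 2) and log_accept_ratio (the log Metropolis-Hastings ratio, cf. Eq. (16) below) are hypothetical placeholders supplied by the caller, not names defined in the text.

import numpy as np

# Minimal control-flow sketch of the add/delete move for one variable j.
# `propose_activation` and `log_accept_ratio` are hypothetical callables
# supplied by the caller; they are not spelled out in this excerpt.
def update_variable_j(j, delta, gamma, beta, propose_activation, log_accept_ratio,
                      rng=None):
    rng = rng or np.random.default_rng()
    if delta[j] == 0:
        # Propose activating X_j: draw its task-wise indicators and
        # coefficients from their conditional distributions given delta_j = 1.
        gamma_j, beta_j = propose_activation(j)
        proposal = (1, gamma_j, beta_j)
    else:
        # Propose deactivating X_j: switch delta_j to 0 and zero out all of
        # its second-layer indicators and coefficients.
        M = gamma.shape[1]
        proposal = (0, np.zeros(M, dtype=int), np.zeros(M))
    # Metropolis-Hastings acceptance-rejection rule.
    if np.log(rng.uniform()) < log_accept_ratio(j, proposal):
        delta[j], gamma[j, :], beta[j, :] = proposal
    return delta, gamma, beta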
Let Θ_j = (δ_j, γ^{(j)}, β^{(j)}) be the parameter set for the j-th variable, X_j, where γ^{(j)} = (γ_{j,1}, ···, γ_{j,M}) and β^{(j)} = (β_{j,1}, ···, β_{j,M}) are the corresponding second-layer indicators and coefficients. The proposed transition of Θ_j can be defined as

T(\Theta_j \to \Theta_j') = P\bigl(\beta^{(j)}, \gamma^{(j)} \mid R^{(j)}, \delta_j = 1, \sigma\bigr),   (14)

T(\Theta_j' \to \Theta_j) = 1,   (15)

where Θ_j' = (δ_j = 1, γ^{(j)}, β^{(j)}), Θ_j = (δ_j = 0, γ^{(j)} = 0, β^{(j)} = 0), R^{(j)} = (R_{j,1}, ···, R_{j,M}), and {β^{(j)}, γ^{(j)}} are sampled from the joint posterior distribution.
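To make the proposal in Eq. (14) concrete, the sketch below draws (γ_{j,m}, β_{j,m}) for a single task m from its conditional posterior given the partial residual R_{j,m}. The modelling details are assumptions for illustration only (a spike-and-slab prior β_{j,m} | γ_{j,m} = 1 ~ N(0, τ_{j,m}), γ_{j,m} | δ_j = 1 ~ Bernoulli(p_{j,m}), Gaussian noise with variance σ²); the exact conditionals would come from the component-wise Gibbs sampler of Chen et al. (2011) [2].

import numpy as np

# Hedged sketch of one component of the proposal in Eq. (14): draw
# (gamma_{j,m}, beta_{j,m}) for a single task m given the partial residual
# R_jm (task-m response minus the fitted contribution of all other active
# variables). Assumed priors, for illustration only:
#   beta_{j,m} | gamma_{j,m}=1 ~ N(0, tau_jm),
#   gamma_{j,m} | delta_j=1 ~ Bernoulli(p_jm),  noise variance sigma2.
def draw_gamma_beta(x_j, R_jm, tau_jm, p_jm, sigma2, rng):
    xtx = x_j @ x_j
    xtr = x_j @ R_jm
    # Conditional posterior of beta_{j,m} when gamma_{j,m} = 1:
    # N(r_{j,m}, sigma*^2_{j,m}).
    s2_star = tau_jm * sigma2 / (sigma2 + tau_jm * xtx)
    r = tau_jm * xtr / (sigma2 + tau_jm * xtx)
    # Posterior inclusion probability rho_{j,m} via the marginal likelihood
    # ratio of gamma_{j,m} = 1 against gamma_{j,m} = 0.
    log_bf = 0.5 * np.log(sigma2 / (sigma2 + tau_jm * xtx)) + \
             tau_jm * xtr ** 2 / (2.0 * sigma2 * (sigma2 + tau_jm * xtx))
    odds = np.exp(log_bf) * p_jm / (1.0 - p_jm)
    rho = odds / (1.0 + odds)
    gamma = int(rng.uniform() < rho)
    beta = rng.normal(r, np.sqrt(s2_star)) if gamma else 0.0
    return gamma, beta, rho, r, s2_star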
Here T(Θ_j → Θ_j') is the proposal distribution for changing δ_j from 0 to 1, and T(Θ_j' → Θ_j) is the proposal distribution for switching δ_j to 0. Suppose the variable X_j is not currently included in S, i.e., δ_j = 0. Then, after sampling γ_{j,m} and β_{j,m} with δ_j set to 1, we calculate the acceptance probability A_j as:
A_j(\Theta_j \to \Theta_j')
  = \frac{P(\Theta_j')\, T(\Theta_j' \to \Theta_j)}{P(\Theta_j)\, T(\Theta_j \to \Theta_j')}
  = \frac{P(\delta_j = 1, \gamma^{(j)}, \beta^{(j)} \mid Y, \delta_{-j}, \gamma^{(-j)}, \beta^{(-j)}, \sigma) \cdot 1}
         {P(\delta_j = 0, \gamma^{(j)} = 0, \beta^{(j)} = 0 \mid Y, \delta_{-j}, \gamma^{(-j)}, \beta^{(-j)}, \sigma) \cdot P(\beta^{(j)}, \gamma^{(j)} \mid R^{(j)}, \delta_j = 1, \sigma)}
  = \frac{\prod_{m=1}^{M} P(Y_m \mid \beta_{j,m}, \gamma_{j,m}, \delta_j = 1, \delta_{-j}, \beta_{-j,m}, \sigma)\, P(\beta_{j,m}, \gamma_{j,m} \mid \delta_j = 1)}
         {\prod_{m=1}^{M} P(Y_m \mid \delta_j = 0, \delta_{-j}, \beta_{-j,m}, \sigma)}
    \times \frac{1 - \theta_j}{\theta_j}
    \times \frac{1}{\prod_{m=1}^{M} P(\beta_{j,m} \mid \gamma_{j,m}, R_{j,m}, \delta_j = 1, \sigma)\, P(\gamma_{j,m} \mid R_{j,m}, \delta_j = 1, \sigma)}

  = \prod_{m=1}^{M}
    \left[ \frac{\sigma^{*}_{j,m}}{\tau_{j,m}^{1/2}}
      \exp\!\left( -\frac{\sigma^2 + \tau_{j,m} X_j^{\top} X_j}{2\, \tau_{j,m}\, \sigma^2}\, \beta_{j,m}^2
                   + \frac{R_{j,m}^{\top} X_j}{\sigma^2}\, \beta_{j,m}
                   + \frac{(\beta_{j,m} - r_{j,m})^2}{2\, \sigma^{*2}_{j,m}} \right)
      \cdot \frac{p_{j,m}}{\rho_{j,m}} \right]^{\gamma_{j,m}}
    \left[ \frac{1 - p_{j,m}}{1 - \rho_{j,m}} \right]^{1 - \gamma_{j,m}}
    \cdot \frac{1 - \theta_j}{\theta_j}.   (16)
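In log space, the acceptance ratio of Eq. (16) (in its factorized, second form) can be evaluated roughly as in the sketch below. It reuses the illustrative spike-and-slab assumptions from the previous sketch (prior variance τ_{j,m}, prior inclusion probability p_{j,m}, proposal quantities ρ_{j,m}, r_{j,m}, σ*²_{j,m}) and takes the prior odds P(δ_j = 1)/P(δ_j = 0) as an input rather than committing to a particular prior on δ_j.

import numpy as np

def _norm_logpdf(x, mean, var):
    # Log density of a univariate normal distribution.
    return -0.5 * (np.log(2.0 * np.pi * var) + (x - mean) ** 2 / var)

# Hedged sketch of log A_j for an activation proposal, following the
# factorized form of Eq. (16): per-task likelihood ratio times the prior of
# (beta_{j,m}, gamma_{j,m}) given delta_j = 1, divided by the proposal density
# of Eq. (14), times the prior odds of delta_j = 1 vs 0. Inputs r, s2_star,
# rho are the proposal quantities from the previous sketch; the
# parametrization is an assumption for illustration.
def log_accept_activation(x_j, R, gamma, beta, tau, p, rho, r, s2_star,
                          sigma2, prior_odds_j):
    log_A = np.log(prior_odds_j)
    xtx = x_j @ x_j
    for m in range(len(R)):
        if gamma[m] == 1:
            # log P(Y_m | beta_{j,m}, ...) - log P(Y_m | delta_j = 0, ...)
            log_A += (x_j @ R[m]) * beta[m] / sigma2 - xtx * beta[m] ** 2 / (2.0 * sigma2)
            # + log prior of (beta_{j,m}, gamma_{j,m}) given delta_j = 1
            log_A += _norm_logpdf(beta[m], 0.0, tau[m]) + np.log(p[m])
            # - log proposal density of (beta_{j,m}, gamma_{j,m})   (Eq. (14))
            log_A -= _norm_logpdf(beta[m], r[m], s2_star[m]) + np.log(rho[m])
        else:
            # beta_{j,m} = 0: the likelihood ratio is 1, only Bernoulli terms remain.
            log_A += np.log(1.0 - p[m]) - np.log(1.0 - rho[m])
    return log_A

# The proposed activation is accepted with probability min(1, exp(log_A)),
# e.g. accept when np.log(rng.uniform()) < log_accept_activation(...).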