the posterior distribution and the values of hyperparameters by repeating the E and M steps. As a result of this recursive procedure, the marginal likelihood $p(y|\Phi, \Lambda)$ is increased. A proof may be found, for example, in [1].
B.5.6 Hyperparameter Update Equations when $\Phi = \alpha I$ and $\Lambda = \beta I$
In Sect. 2.10, we show that the Bayesian estimate of $x_k$, $\bar{x}_k$, becomes equal to the $L_2$-norm regularized minimum-norm solution if we use $\Phi = \alpha I$ and $\Lambda = \beta I$ in the Gaussian model. Let us derive update equations for the scalar hyperparameters $\alpha$ and $\beta$ in this case.
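As a quick numerical illustration of this equivalence, the following is a minimal NumPy sketch (not from the text; the matrix sizes, variable names, and random test data are my own assumptions, and the model is the standard Gaussian one $y_k = H x_k + \text{noise}$ with prior precision $\alpha I$ and noise precision $\beta I$):

import numpy as np

# Illustrative check only: with prior precision alpha*I and noise precision beta*I,
# the Bayesian posterior mean of x_k coincides with the L2-norm regularized
# minimum-norm solution.  Sizes and values below are arbitrary assumptions.
rng = np.random.default_rng(0)
M, N = 10, 50                      # sensors, sources (hypothetical sizes)
alpha, beta = 2.0, 5.0             # scalar hyperparameters
H = rng.standard_normal((M, N))    # forward (lead-field) matrix
y = rng.standard_normal(M)         # one data vector y_k

# Posterior mean: beta * Gamma^{-1} H^T y_k, with Gamma = alpha*I + beta*H^T H
Gamma = alpha * np.eye(N) + beta * H.T @ H
x_bayes = beta * np.linalg.solve(Gamma, H.T @ y)

# L2-regularized minimum-norm solution: H^T (H H^T + (alpha/beta) I)^{-1} y_k
x_mn = H.T @ np.linalg.solve(H @ H.T + (alpha / beta) * np.eye(M), y)

print(np.allclose(x_bayes, x_mn))  # prints True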
The average data likelihood in this case is given by

\[
\Theta(\alpha, \beta) = \frac{NK}{2} \log \alpha - \frac{\alpha}{2} E\!\left[ \sum_{k=1}^{K} x_k^T x_k \right]
+ \frac{MK}{2} \log \beta - \frac{\beta}{2} E\!\left[ \sum_{k=1}^{K} (y_k - H x_k)^T (y_k - H x_k) \right].
\]
Therefore, using

\[
\frac{\partial}{\partial \alpha} \Theta(\alpha, \beta) = \frac{NK}{2} \frac{1}{\alpha} - \frac{1}{2} E\!\left[ \sum_{k=1}^{K} x_k^T x_k \right] = 0
\]
and

\[
E\!\left[ \sum_{k=1}^{K} x_k^T x_k \right]
= \sum_{k=1}^{K} \operatorname{tr}\!\left( E[x_k x_k^T] \right)
= \sum_{k=1}^{K} \operatorname{tr}\!\left( \bar{x}_k \bar{x}_k^T + \Gamma^{-1} \right)
= \sum_{k=1}^{K} \bar{x}_k^T \bar{x}_k + K \operatorname{tr}(\Gamma^{-1}),
\]

(the second equality holds because, under the posterior distribution, $x_k$ has mean $\bar{x}_k$ and covariance $\Gamma^{-1}$, so that $E[x_k x_k^T] = \bar{x}_k \bar{x}_k^T + \Gamma^{-1}$),
we get

\[
\frac{1}{\alpha} = \frac{1}{NK} E\!\left[ \sum_{k=1}^{K} x_k^T x_k \right]
= \frac{1}{NK} \sum_{k=1}^{K} \bar{x}_k^T \bar{x}_k + \frac{1}{N} \operatorname{tr}(\Gamma^{-1})
\tag{B.46}
\]

for the update equation of $\alpha$.
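As an illustration of how (B.46) and the expectation identity above translate into code, here is a minimal NumPy sketch (not taken from the text; the array shapes, variable names, and the Monte Carlo check are my own assumptions):

import numpy as np

def update_alpha(x_bar, Gamma):
    # Hypothetical transcription of Eq. (B.46).
    # x_bar : (N, K) array whose k-th column is the posterior mean of x_k.
    # Gamma : (N, N) posterior precision matrix, so Gamma^{-1} is the covariance.
    N, K = x_bar.shape
    # 1/alpha = (1/(NK)) * sum_k x_bar_k^T x_bar_k + (1/N) * tr(Gamma^{-1})
    inv_alpha = np.sum(x_bar**2) / (N * K) + np.trace(np.linalg.inv(Gamma)) / N
    return 1.0 / inv_alpha

# Monte Carlo check of E[x^T x] = x_bar^T x_bar + tr(Gamma^{-1}) for a single k
rng = np.random.default_rng(0)
N = 20
x_bar_k = rng.standard_normal(N)
A = rng.standard_normal((N, N))
Gamma = A @ A.T + N * np.eye(N)              # an arbitrary positive-definite precision
cov = np.linalg.inv(Gamma)                   # posterior covariance Gamma^{-1}
samples = rng.multivariate_normal(x_bar_k, cov, size=100_000)
print(np.mean(np.sum(samples**2, axis=1)))   # Monte Carlo estimate of E[x^T x]
print(x_bar_k @ x_bar_k + np.trace(cov))     # closed-form value; should agree closely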