where u_1, ..., u_K are collectively denoted u. The noise ε is assumed to be Gaussian with zero mean, i.e.,

p(\varepsilon) = \mathcal{N}(\varepsilon \mid 0, \Lambda^{-1}),    (5.6)
where Λ is a diagonal precision matrix. With this assumption, the conditional probability p(y_k | u_k) is expressed as

p(y_k \mid u_k) = \mathcal{N}(y_k \mid A u_k, \Lambda^{-1}).    (5.7)
The noise ε is also assumed to be independent across time. Thus, we have

p(y \mid u) = p(y_1, \ldots, y_K \mid u_1, \ldots, u_K) = \prod_{k=1}^{K} p(y_k \mid u_k) = \prod_{k=1}^{K} \mathcal{N}(y_k \mid A u_k, \Lambda^{-1}),    (5.8)
where y_1, ..., y_K are collectively denoted y. Using the probability distributions defined above, Bayesian factor analysis factorizes the sensor data into L independent factor activities and additive sensor noise. This factorization is achieved using the EM algorithm [4, 5]. The basics of the EM algorithm are explained in Sect. B.5 in the Appendix.
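The generative model behind Eq. (5.8) can be sketched numerically. The following NumPy snippet draws factor activities u_1, ..., u_K, mixes them through A, and adds Gaussian noise with diagonal precision Λ; all dimensions and numerical values here are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: M sensors, L factors, K time points.
M, L, K = 8, 3, 200
A = rng.standard_normal((M, L))          # mixing matrix A
noise_prec = rng.uniform(5.0, 10.0, M)   # diagonal entries of Lambda

U = rng.standard_normal((L, K))          # factor activities u_1..u_K (columns)
# Noise with precision Lambda: per-sensor std is 1/sqrt(diag(Lambda)).
E = rng.standard_normal((M, K)) / np.sqrt(noise_prec)[:, None]
Y = A @ U + E                            # sensor data y_1..y_K (columns)
```

Each column y_k of Y is one draw from N(y_k | A u_k, Λ^{-1}), matching Eq. (5.7).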
5.2.3 EM Algorithm
The E-step of the EM algorithm derives the posterior distribution p(u_k | y_k). Derivation of the posterior distribution in the Gaussian model is described in Sect. B.3 in the Appendix. Since the posterior distribution is also Gaussian, we define the posterior distribution p(u_k | y_k) such that

p(u_k \mid y_k) = \mathcal{N}(u_k \mid \bar{u}_k, \Gamma^{-1}),    (5.9)
where ū_k is the mean and Γ is the precision matrix. Using Eqs. (B.24) and (B.25) with Φ set to I and H set to A in these equations, we get

\Gamma = A^{T} \Lambda A + I,    (5.10)

\bar{u}_k = (A^{T} \Lambda A + I)^{-1} A^{T} \Lambda y_k.    (5.11)
The equations above are the E-step update equations in the Bayesian factor analysis.
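The E-step updates (5.10) and (5.11) translate directly into NumPy. The sketch below computes the posterior precision Γ and posterior mean ū_k for one data snapshot; the dimensions and random inputs are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: M sensors, L factors.
M, L = 8, 3
A = rng.standard_normal((M, L))                 # mixing matrix A
Lambda = np.diag(rng.uniform(5.0, 10.0, M))     # diagonal noise precision
y_k = rng.standard_normal(M)                    # one sensor-data snapshot y_k

# Eq. (5.10): posterior precision Gamma = A^T Lambda A + I
Gamma = A.T @ Lambda @ A + np.eye(L)

# Eq. (5.11): posterior mean u_bar = Gamma^{-1} A^T Lambda y_k,
# computed via a linear solve rather than an explicit inverse.
u_bar = np.linalg.solve(Gamma, A.T @ Lambda @ y_k)
```

Using np.linalg.solve instead of inverting Γ explicitly is the standard, numerically safer way to apply (5.11).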
Let us derive the M-step update equations for the mixing matrix A and the noise precision matrix Λ. To do so, we derive the average log likelihood, Θ(A, Λ); according to Eqs. (B.34), (5.5), and (5.8), it is expressed as