with $\hat{H}(s_v)$ defined analogously to Eq. (4.9).
Other approximations are also possible, for example using synthetic data generated from the source distributions instead of the data used to learn the parameters of the ICA mixture model.
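As an illustration of this idea, the sketch below estimates per-source entropies by Monte Carlo, drawing synthetic data from a kernel density fitted to each source. The Laplacian source samples, the use of `scipy`'s `gaussian_kde`, and the sample sizes are all illustrative assumptions, not the book's exact estimator:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical source samples for one ICA cluster (M = 2 sources, N = 500 points).
s = rng.laplace(size=(2, 500))

# Fit a kernel density estimate to each source (a stand-in for the book's
# nonparametric source model).
kdes = [gaussian_kde(s[i]) for i in range(s.shape[0])]

# Monte Carlo entropy of each source: H(s_i) ~= -mean(log p(s_i)), evaluated
# on synthetic data resampled from the fitted density rather than on the
# training data itself, as the text suggests.
np.random.seed(1)
entropies = []
for kde in kdes:
    synthetic = kde.resample(2000)
    entropies.append(-np.mean(np.log(kde(synthetic))))

print(entropies)  # per-source entropy estimates (nats)
```

Resampling from the fitted density (rather than reusing the training data) gives an entropy estimate that is not biased toward the particular observations used to fit the model.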
Once the entropy is computed, we have to obtain the cross-entropy terms.
$$
E_{x_u}\!\left[\log p_{x_v}(x)\right] = \int p_{x_u}(x)\,\log p_{x_v}(x)\,dx = \int p_{x_u}(x)\,\log \frac{\prod_{i=1}^{M} p_{s_{v_i}}(s_{v_i})}{\left|\det A_v\right|}\,dx
$$

$$
E_{x_v}\!\left[\log p_{x_u}(x)\right] = \int p_{x_v}(x)\,\log p_{x_u}(x)\,dx = \int p_{x_v}(x)\,\log \frac{\prod_{i=1}^{M} p_{s_{u_i}}(s_{u_i})}{\left|\det A_u\right|}\,dx
\qquad (4.11)
$$
Considering the relationships $x = A_u s_u + b_u$ and $x = A_v s_v + b_v$, and thus $s_v = A_v^{-1}\left(A_u s_u + b_u - b_v\right)$, we obtain for the first cross-entropy in Eq. (4.11):
$$
\int p_{x_u}(x)\,\log p_{x_v}(x)\,dx = \int p_{s_u}(s)\,\log \frac{\prod_{i=1}^{M}\sum_{n=1}^{N} \alpha\, e^{-\left(s_{v_i}-s_{v_i}(n)\right)^2 / 2h^2}}{\left|\det A_v\right|}\,ds
\qquad (4.12)
$$

with $s_{v_i}$ being the $i$-th element of the vector $s_v$, i.e., $s_{v_i} = \left[A_v^{-1}\left(A_u s + b_u - b_v\right)\right]_i$.

Using Eq. (4.12) and applying the independence of the sources for cluster $u$, we can obtain:

$$
\begin{aligned}
\int p_{x_u}(x)\,\log p_{x_v}(x)\,dx
&= -\log\left|\det A_v\right| + \int \prod_{i=1}^{M} p_{s_{u_i}}(s_i)\,\sum_{i=1}^{M}\log\sum_{n=1}^{N}\alpha\, e^{-\left(s_{v_i}-s_{v_i}(n)\right)^2/2h^2}\,ds \\
&= -\log\left|\det A_v\right| + \sum_{i=1}^{M}\int p_{s_{u_M}}(s_M)\,ds_M\cdots\int p_{s_{u_1}}(s_1)\,\log\sum_{n=1}^{N}\alpha\, e^{-\left(s_{v_i}-s_{v_i}(n)\right)^2/2h^2}\,ds_1
\end{aligned}
\qquad (4.13)
$$
Again, there is no analytical solution to Eq. ( 4.13 ), so we have to use numerical
alternatives to approximate the cross-entropy. Following the same idea as above
with the entropy, we can use the data corresponding to every source for cluster u in
order to approximate the expectation of Eq. (4.13). Assuming that we have, or can generate, $Q_i$ observations according to the distribution $p_{s_{u_i}}(s)$, $i = 1, \ldots, M$, we can estimate:
$$
\int p_{s_{u_M}}(s_M)\,ds_M\cdots\int p_{s_{u_1}}(s_1)\,\log\sum_{n=1}^{N}\alpha\, e^{-\left(s_{v_i}-s_{v_i}(n)\right)^2/2h^2}\,ds_1
\approx \frac{1}{\prod_{i=1}^{M} Q_i}\sum_{s_M=1}^{Q_M}\cdots\sum_{s_1=1}^{Q_1}\log\sum_{n=1}^{N}\alpha\, e^{-\left(s_{v_i}-s_{v_i}(n)\right)^2/2h^2}
\qquad (4.14)
$$
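The Monte Carlo approximation of Eq. (4.14) can be sketched numerically as follows. The mixing matrices, bias vectors, Laplacian source samples, bandwidth $h$, and kernel constant $\alpha$ are all illustrative assumptions; in practice these come from the fitted ICA mixture model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ICA-mixture parameters for clusters u and v (M = 2 sources).
M, N = 2, 400
A_u, b_u = np.array([[1.0, 0.3], [0.2, 1.0]]), np.zeros(2)
A_v, b_v = np.array([[0.9, -0.1], [0.4, 1.1]]), np.full(2, 0.5)

# Source samples defining the kernel density of cluster v's sources.
s_v_train = rng.laplace(size=(M, N))
h = 0.5                                      # kernel bandwidth (assumed)
alpha = 1.0 / (N * h * np.sqrt(2 * np.pi))   # kernel constant (assumed)

def log_kernel_density(x, centres):
    """log sum_n alpha * exp(-(x - s(n))^2 / (2 h^2)) for each entry of x."""
    d = x[:, None] - centres[None, :]
    return np.log(alpha) + np.logaddexp.reduce(-d**2 / (2 * h**2), axis=1)

# Draw Q samples from cluster u's sources, map them into v's source
# coordinates via s_v = A_v^{-1}(A_u s_u + b_u - b_v), and average the
# log-density: this replaces the nested integrals of Eq. (4.14) with
# sample means over the Q_i observations.
Q = 1000
s_u = rng.laplace(size=(M, Q))
s_v = np.linalg.inv(A_v) @ (A_u @ s_u + (b_u - b_v)[:, None])

# First cross-entropy term of Eq. (4.11), per Eq. (4.13):
cross_term = (-np.log(abs(np.linalg.det(A_v)))
              + sum(np.mean(log_kernel_density(s_v[i], s_v_train[i]))
                    for i in range(M)))
print(cross_term)
```

Here a single set of $Q$ joint source samples replaces the $M$ nested sums; because the sources of cluster $u$ are independent, averaging over joint samples approximates the same expectation.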