Fig. 2.2 InfoMax principle: mixing, unmixing, and nonlinear transformation
procedure of deterministic algorithms can exploit the algebraic structure of the matrices involved. The components are extracted using two methods. The first method extracts the sources one by one (deflation method), i.e., $s_i = b_i x$; the second extracts all the sources simultaneously (symmetric method). The contrasts corresponding to these methods are called one-unit (one component) and multi-unit (several or all components) contrast functions.
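As a minimal numerical illustration of the two extraction styles (the mixing matrix, source distribution, and the use of the exact inverse as a stand-in for an optimized demixing matrix are all assumptions for this sketch, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative): two source signals, a random mixing
# matrix A, and observations x = A s.
n, T = 2, 500
s_true = rng.laplace(size=(n, T))      # super-Gaussian sources
A = rng.normal(size=(n, n))
x = A @ s_true

# Assume a demixing matrix B has been found; here we use the exact
# inverse as a hypothetical stand-in for the result of a contrast
# optimization.
B = np.linalg.inv(A)

# Deflation (one-unit) style: each source recovered separately, s_i = b_i x.
s_deflation = np.vstack([B[i] @ x for i in range(n)])

# Symmetric (multi-unit) style: all sources recovered at once, s = B x.
s_symmetric = B @ x

print(np.allclose(s_deflation, s_symmetric))  # True
```

For the same B both routes return the same estimates; the practical difference lies in how the contrast is optimized (one row of B at a time versus all rows jointly).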
We selected some of the most representative ICA algorithms (InfoMax [32, 23], JADE [33], FastICA [20, 22], and TDSEP [34]), derived from different perspectives of contrast design (entropy-, moment/cumulant-, and correlation-based methods). These algorithms will be used in comparisons with the techniques proposed in this work. A brief review of the selected ICA algorithms is included below.
2.2.1 InfoMax
The InfoMax algorithm was proposed in [32]. The InfoMax principle consists of maximizing the output entropy of a system $z = g(s) = g(Bx)$ with respect to the demixing matrix B, where g is a nonlinear transformation (see Fig. 2.2).
The system shown in Fig. 2.2 can be considered as a neural network. The goal is to obtain the ICA parameters that allow an efficient flow of information through the neural network. This requires maximizing the mutual information between the inputs x and the outputs z. It can be shown that, under the no-noise assumption, maximizing this mutual information is equivalent to maximizing the joint (output) entropy [35].
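The argument behind this equivalence (a standard sketch, not verbatim from the text) fits in two lines:

```latex
% Decompose the mutual information:
I(x; z) = H(z) - H(z \mid x).
% With no noise, z = g(Bx) is a deterministic function of x, so the
% conditional term H(z \mid x) does not depend on B, and therefore
\operatorname*{arg\,max}_B \; I(x; z) = \operatorname*{arg\,max}_B \; H(z).
```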
The transformation $g(\cdot)$ is an $\mathbb{R}^n \rightarrow \mathbb{R}^n$ component-wise nonlinear function that operates on the sources estimated by the linear part of the system, i.e., $g(s)_i = g_i(s_i)$, $1 \le i \le n$. Thus, the InfoMax contrast function is defined as
$$\phi_I(B) = H[g(Bx)] \qquad (2.10)$$
where $H(\cdot)$ is the differential entropy. The scalar functions $g_1, \ldots, g_n$ are taken to be ''squashing functions'', monotonically increasing and capable of mapping a wide input domain into the narrow output domain (0, 1). The output entropy is estimated as [5]
$$H[g(Bx)] = \sum_i E\left[\log g_i'(b_i x)\right] + \log |\det B| \qquad (2.11)$$
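A minimal numerical sketch of ascending this output entropy (assuming a logistic squashing function and the standard natural-gradient update; the mixing matrix, learning rate, and iteration count below are illustrative choices, not values from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy mixture (illustrative values).
n, T = 2, 2000
s = rng.laplace(size=(n, T))              # super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # mixing matrix
x = A @ s

def sigmoid(u):
    # Logistic "squashing function": monotonically increasing, range (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

# Natural-gradient ascent on the output entropy of Eq. (2.11):
#   B <- B + mu * (I + (1 - 2 g(Bx)) (Bx)^T / T) B
B = np.eye(n)
mu = 0.02
for _ in range(1000):
    u = B @ x
    y = sigmoid(u)
    B += mu * (np.eye(n) + (1.0 - 2.0 * y) @ u.T / T) @ B

# After convergence, B A should approximate a scaled permutation matrix,
# i.e., each row is dominated by a single entry.
print(np.round(B @ A, 2))
```

The update is self-stabilizing: when an output saturates, the $(1 - 2g(Bx))$ term shrinks the corresponding row of B, so no explicit normalization is needed.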