Digital Signal Processing Reference
Figure 5.3
Applying ICA to a random vector x = As that does not fulfill the ICA model; here s is chosen to consist of a two-dimensional and a one-dimensional irreducible component. Shown are statistics over 100 runs of the Amari error between the random original mixing matrix and the one reconstructed by the three ICA algorithms FastICA, JADE, and Extended Infomax. Clearly, the original mixing matrix could not be reconstructed in any of the experiments; interestingly, however, the latter two algorithms do find an ISA up to permutation, which is explained by theorem 5.2.
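The Amari error reported in the caption measures how far the product of the inverse estimated mixing matrix and the true mixing matrix is from a scaled permutation; it vanishes exactly when the mixing matrix is recovered up to the usual ICA indeterminacies. A minimal sketch; the function name and the particular normalization constant are my own choices:

```python
import numpy as np

def amari_error(A, A_hat):
    """Amari index between a true mixing matrix A and an estimate A_hat.

    Zero iff inv(A_hat) @ A is a scaled permutation matrix, i.e. iff
    A_hat equals A up to column scaling and permutation.
    """
    P = np.abs(np.linalg.inv(A_hat) @ A)
    m = P.shape[0]
    # deviation of each row/column from having a single dominant entry
    row = (P / P.max(axis=1, keepdims=True)).sum(axis=1) - 1
    col = (P / P.max(axis=0, keepdims=True)).sum(axis=0) - 1
    return (row.sum() + col.sum()) / (2 * m * (m - 1))
```

For a perfect reconstruction, amari_error(A, A) is 0, and multiplying the estimate by any scaled permutation leaves the index at 0, which is why the figure can report the error modulo the ICA indeterminacies.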
groups S_i in advance. Of course, some restriction is necessary; otherwise, no decomposition would be enforced at all. The key idea in [251] is to allow only irreducible components, defined as random vectors without lower-dimensional independent components.
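A standard example of an irreducible two-dimensional component is the uniform distribution on the unit circle: its coordinates are uncorrelated yet deterministically coupled, so no invertible transformation splits it into one-dimensional independent parts. A small numerical illustration (the distribution choice is illustrative, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 100_000)
s = np.vstack([np.cos(theta), np.sin(theta)])  # uniform on the unit circle

# The two coordinates are uncorrelated ...
print(np.corrcoef(s)[0, 1])                     # close to 0
# ... yet fully dependent: the samples always lie on the circle
print(np.max(np.abs(s[0]**2 + s[1]**2 - 1.0)))  # 0 up to rounding
```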
The advantage of this formulation is that it clearly applies to any random vector, although a trivial decomposition may of course result if the random vector is itself irreducible. Obvious indeterminacies of an ISA of x are scalings (i.e., invertible transformations within each s_i) and permutations of the s_i of the same dimension. These are already all of the indeterminacies, as theorem 5.3 shows.
Theorem 5.3 (Existence and Uniqueness of ISA). Given a random vector X with existing covariance, an ISA of X exists and is unique except for permutation of components of the same dimension and invertible transformations within each independent component and within the Gaussian part.
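The indeterminacies named in theorem 5.3 can be checked numerically: applying an invertible block-diagonal transformation within each component yields a different ISA representation of the very same observed vector. A sketch with a hypothetical 2-D-plus-1-D source layout matching figure 5.3 (sources and mixing matrix are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# hypothetical sources: one 2-D irreducible component, one 1-D component
theta = rng.uniform(0.0, 2.0 * np.pi, n)
s = np.vstack([np.cos(theta), np.sin(theta),    # 2-D component
               rng.laplace(size=n)])            # 1-D component
A = rng.normal(size=(3, 3))                     # random mixing matrix
x = A @ s

# invertible transformation acting within each component (block-diagonal)
L = np.block([[rng.normal(size=(2, 2)), np.zeros((2, 1))],
              [np.zeros((1, 2)),        rng.normal(size=(1, 1))]])

A2, s2 = A @ np.linalg.inv(L), L @ s
# (A2, s2) is another valid ISA representation of the same x:
print(np.allclose(A2 @ s2, x))  # True
```

Since L only mixes coordinates inside a component, s2 keeps the same independence structure as s, so the theorem cannot distinguish the two representations.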
Here, no Gaussians had to be excluded from S (as in the previous uniqueness theorems), because a dimension-reduction result from [104, 251] can be used. The connection of the various factorization models and