are often called whitened signals. As explained above, the mixing matrix linking the whitened signals with the sources reduces to the unitary transformation in Eq. (3.21). In consequence, even if PCA does not generally do the whole job, it does at least half of it in a computationally affordable manner, as it is based on second-order statistics and standard matrix decompositions such as the EVD or the SVD (Sects. 3.2.2.1 and 3.2.2.4).
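As a concrete sketch of this "half of the job", the snippet below whitens a pair of mixtures via the EVD of their sample covariance. The sources, the 2 × 2 mixing matrix, and the sample size are made-up choices for illustration, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: two independent sources mixed by an arbitrary
# full-rank matrix A (both invented here for illustration).
s = rng.uniform(-1.0, 1.0, size=(2, 10000))   # independent uniform sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                    # mixing matrix
x = A @ s                                     # observed mixtures

# Whitening via the EVD of the sample covariance: z = D^{-1/2} E^T x
x = x - x.mean(axis=1, keepdims=True)         # center the observations
R = np.cov(x)                                 # sample covariance matrix
d, E = np.linalg.eigh(R)                      # EVD: R = E diag(d) E^T
W = np.diag(1.0 / np.sqrt(d)) @ E.T           # whitening matrix
z = W @ x                                     # whitened signals

# The covariance of the whitened signals is the identity matrix
print(np.round(np.cov(z), 2))
```

After this step, only an unknown rotation separates `z` from the sources, which is exactly what higher-order methods such as ICA must resolve.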
3.3.3 Beyond PCA: ICA
We have just seen that if the mixing matrix reduces to an orthogonal matrix Q, then the covariance of x = Qs does not depend on the mixing matrix at all, and PCA fails to perform its identification. By contrast, independent component analysis (ICA), a statistical tool for transforming multivariate data into independent random variables [7], is able to identify any full column rank mixing matrix under rather general conditions, summarized later in this section. Second-order statistics are not sufficient to account for statistical independence, as illustrated by the inability of PCA to perform the separation in the general case. Through its use of second-order statistics, PCA implicitly assumes that the principal components have Gaussian distributions, and it indeed yields the maximum-likelihood estimate of the separating matrix for uncorrelated Gaussian sources in noiseless scenarios. Hence, ICA exploits, either explicitly or implicitly, deviations from Gaussianity. This can be done with the help of optimization criteria based on statistical tools such as entropy, mutual information, or cumulants, as described next.
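The failure of second-order statistics under orthogonal mixing can be checked numerically. In the sketch below (sources, rotation angle, and sample size are arbitrary choices for illustration), unit-variance independent sources are mixed by a rotation Q; the covariance of the mixtures remains the identity, so PCA has nothing left to identify:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unit-variance independent sources: uniform on [-sqrt(3), sqrt(3)]
# has variance (2*sqrt(3))**2 / 12 = 1.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 100000))

theta = 0.7                                   # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = Q @ s                                     # orthogonally mixed signals

# cov(Qs) = Q Q^T = I: sources and mixtures are indistinguishable
# from their second-order statistics alone.
print(np.round(np.cov(s), 2))
print(np.round(np.cov(x), 2))
```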
3.3.3.1 Statistical Tools
A Gaussian probability density function is entirely characterized by its mean and
variance, i.e., its moments of order 1 and 2 only. Hence, a simple intuitive way to
measure deviations from Gaussianity is via moments of order higher than two. The
$r$th-order moment of a real random variable $z$ is defined as $\mu^{(r)} = \mathrm{E}\{z^r\}$. In the multivariate case, the set of second-order moments of random vector $z \in \mathbb{R}^M$ can be stored in its covariance matrix, with elements $[R_z]_{ij} = \mathrm{E}\{z_i z_j\}$, as defined in matrix form by Eq. (3.2). Similarly, the $(M \times M \times M \times M)$ array containing all fourth-order moments can be defined as $\mu_{ijk\ell} = \mathrm{E}\{z_i z_j z_k z_\ell\}$. Yet if vector $z$ is Gaussian, then this moment can be expressed as a function of moments of order 1 and 2 only. If we assume for simplicity that $z$ is zero-mean, then it can be shown that
$$\mu_{ijk\ell} = R_{ij} R_{k\ell} + R_{ik} R_{j\ell} + R_{i\ell} R_{jk},$$
which reduces in the scalar case to the well-known relation $\mu^{(4)} = 3\,(\mu^{(2)})^2$. It follows
that a natural way to measure deviation from Gaussianity of a random vector z
consists of computing the so-called fourth-order cumulant:
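In the zero-mean scalar case this quantity reduces to $\kappa_4 = \mu^{(4)} - 3\,(\mu^{(2)})^2$, which vanishes for a Gaussian variable. The sketch below estimates it empirically; the sample sizes and the particular distributions are illustrative choices, not prescribed by the text:

```python
import numpy as np

rng = np.random.default_rng(1)

def kappa4(z):
    """Sample fourth-order cumulant of a zero-mean scalar variable:
    kappa_4 = mu_(4) - 3 * mu_(2)^2 (zero for Gaussian data)."""
    mu2 = np.mean(z**2)
    mu4 = np.mean(z**4)
    return mu4 - 3.0 * mu2**2

n = 1_000_000
gauss = rng.normal(0.0, 2.0, n)      # Gaussian, variance 4
unif = rng.uniform(-1.0, 1.0, n)     # non-Gaussian (sub-Gaussian) variable

print(kappa4(gauss))   # close to 0, as predicted by mu_(4) = 3 mu_(2)^2
print(kappa4(unif))    # clearly negative: deviation from Gaussianity
```

For the uniform variable on $[-1, 1]$, $\mu^{(2)} = 1/3$ and $\mu^{(4)} = 1/5$, so the exact cumulant is $1/5 - 3/9 \approx -0.133$, which the estimate approaches for large samples.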