among a set of variables is equivalent to maximizing their statistical indepen-
dence. It is possible to show that the mutual information of the estimated
components is
"
log |
gs
ps
()|
()
#
I
()
s
=−
H E
( ())
g
s
+
i
i
(16.43)
!
$
i
i
i
where $p_i(s_i)$ is the probability density function of the independent component $s_i$. It is important to choose the proper shape of the nonlinearities $g_i$ such that $g_i'(s_i) = p_i(s_i)$. This means that it is necessary to have some prior knowledge about the statistical distribution of the independent components: for example, whether they have a super-Gaussian distribution, such as images with small activation foci in a large number of voxels, or a sub-Gaussian distribution, such as time-domain components that may be strongly related to a block-designed task. The reason why it is not possible to estimate Gaussian-distributed components will be shown using intuitive reasoning afterward. This method was extended to allow the separation of both super- and sub-Gaussian-distributed components [84]. The weights are estimated using a stochastic gradient descent algorithm [78] such that at each step the weights are updated following the relations
\[
\Delta W \propto \left[W^T\right]^{-1} + f(Wx)\,x^T
\qquad (16.44)
\]
and $f_i = (\log p_i)'$. It can be shown that this algorithm is equivalent to the maximum likelihood approach for the estimation of components with known distribution densities. A simplification and optimization of this method was given by Amari [79], using the natural gradient method, which is obtained by multiplying Equation 16.44 on the right by $W^T W$, resulting in
\[
\Delta W \propto \left[I + f(Wx)\,x^T W^T\right] W
\qquad (16.45)
\]
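As a rough illustration of Equation 16.45, a single natural-gradient update can be written in a few lines of NumPy. This is only a sketch: the function name, the learning rate, and the generic score function f are illustrative choices, not details given in the text.

```python
import numpy as np

def natural_gradient_step(W, x, f, lr=0.01):
    """One stochastic natural-gradient update (cf. Equation 16.45):
    W <- W + lr * [I + f(Wx) (Wx)^T] W,
    for one data vector x (shape (n,)) and unmixing matrix W (shape (n, n)).
    f is the score function f_i = (log p_i)', applied elementwise."""
    y = W @ x                                          # current source estimates, y = Wx
    update = (np.eye(W.shape[0]) + np.outer(f(y), y)) @ W
    return W + lr * update
```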
The algorithm stabilizes when
\[
f(Wx)\,x^T W^T = -I
\qquad (16.46)
\]
The minus sign comes from the functions $f_i$. Typical choices are $f(s) = -2\tanh(s)$ for super-Gaussian components and $f(s) = \tanh(s) - s$ for sub-Gaussian components. The relation in Equation 16.46 shows that this method can be viewed as nonlinear decorrelation, and it is interesting to note that a Taylor expansion of the nonlinear functions yields higher-order correlations of the variables. These are exactly the measures that have to be taken into account in order to estimate statistical independence.
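To make the whole procedure concrete, the following batch-mode NumPy sketch combines the natural-gradient update of Equation 16.45 with the two nonlinearities quoted above and the stabilization condition of Equation 16.46. The function name, the learning rate, the kurtosis-based switch between the super- and sub-Gaussian score functions, and the assumption of centered, whitened data are illustrative choices, not details given in the text.

```python
import numpy as np

def infomax_ica(X, lr=0.001, n_iter=500, tol=1e-4):
    """Batch natural-gradient ICA (cf. Equation 16.45) on data X of shape
    (n_components, n_samples); X is assumed centered and whitened."""
    n, T = X.shape
    W = np.eye(n)                               # initial unmixing matrix
    for _ in range(n_iter):
        Y = W @ X                               # current source estimates
        # choose the score function per component from the sign of the
        # excess kurtosis: > 0 super-Gaussian, < 0 sub-Gaussian
        kurt = np.mean(Y**4, axis=1) - 3.0 * np.mean(Y**2, axis=1) ** 2
        F = np.where(kurt[:, None] > 0, -2.0 * np.tanh(Y), np.tanh(Y) - Y)
        FYt = (F @ Y.T) / T                     # sample average of f(Wx)(Wx)^T
        W = W + lr * (np.eye(n) + FYt) @ W      # natural-gradient step (16.45)
        # stop once f(Wx)(Wx)^T is close to -I, i.e. the update vanishes (16.46)
        if np.max(np.abs(FYt + np.eye(n))) < tol:
            break
    return W
```

As a hypothetical usage example, given whitened fMRI time courses in an array `X_white`, one would call `W = infomax_ica(X_white)` and then compute `S = W @ X_white` to obtain the estimated independent components.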