…the value of the log of the copula density can be computed, or estimated, from a parametric copula.
In the PCA/ICA literature, contrast functions are objective functions for source separation: let ψ(Y) = 0 imply that Y_i and Y_j are independent ∀ i ≠ j; then ψ is a particular contrast function. The minimization of functions of this type is the essence of the PCA/ICA algorithm.
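As a concrete illustration, here is a minimal sketch of such a contrast function, assuming a histogram (plug-in) estimator of pairwise mutual information; the names hist_mutual_info and pairwise_mi_contrast, and the bin count, are illustrative choices, not from the source.

```python
import numpy as np

def hist_mutual_info(x, y, bins=20):
    """Plug-in estimate of MI(x, y) in nats from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def pairwise_mi_contrast(Y):
    """psi(Y): sum of pairwise MI estimates; ~0 iff components look independent."""
    k = Y.shape[1]
    return sum(hist_mutual_info(Y[:, i], Y[:, j])
               for i in range(k) for j in range(i + 1, k))

rng = np.random.default_rng(0)
indep = rng.normal(size=(20000, 3))            # independent sources
mixed = indep @ rng.normal(size=(3, 3))        # linearly mixed -> dependent
print(pairwise_mi_contrast(indep))             # near 0 (up to estimator bias)
print(pairwise_mi_contrast(mixed))             # clearly positive
```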
Essentially, this approach demonstrates a role for the copula as the apparatus for these contrast functions, exploiting its natural appearance both in measures of association (here, the mutual information) and as a model for dependence/independence. This amounts to choosing the mutual information as the engine for the ICA contrast function. It is a special case of the component analysis problem via minimization of a parametric probability distance; this yields symmetry with the principles of likelihood maximization and employs a decomposition of the Kullback-Leibler distance.
Kullback-Leibler as Dependence Distance

The Kullback-Leibler divergence [Kullback 1959] between two probability density functions f(t) and g(t), which we notate K(f, g), is

$$K(f, g) = \int f(t)\,\log\frac{f(t)}{g(t)}\,dt. \qquad (6)$$

The mutual information is a special instance of the Kullback-Leibler (K-L) probability distance, here between dependence and independence. If X is a k-dimensional multivariate with density function dF and marginal distributions dF_1, ..., dF_k, then

$$\mathrm{MI}(X) = K\!\left(dF,\ \prod_{i=1}^{k} dF_i\right). \qquad (7)$$
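As a quick numerical check of (6) and (7), the sketch below Monte Carlo-estimates MI(X) for a bivariate Gaussian, where the mutual information has the closed form −(1/2) log(1 − ρ²); the sampling setup is an assumption for illustration.

```python
import numpy as np

rho = 0.7
n = 200_000
rng = np.random.default_rng(1)
cov = np.array([[1.0, rho], [rho, 1.0]])
X = rng.multivariate_normal(np.zeros(2), cov, size=n)

# log dF (joint density) minus log of the product of the margins,
# averaged over draws from dF: a Monte Carlo version of K(dF, prod dF_i).
inv = np.linalg.inv(cov)
log_joint = (-0.5 * np.einsum('ni,ij,nj->n', X, inv, X)
             - np.log(2 * np.pi) - 0.5 * np.log(np.linalg.det(cov)))
log_margs = (-0.5 * (X ** 2).sum(axis=1) - np.log(2 * np.pi))
print("MC estimate :", (log_joint - log_margs).mean())
print("closed form :", -0.5 * np.log(1 - rho ** 2))
```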
A classic property of (7) is its decomposability:

$$K(y, s) = K(y, y^{*}) + K(y^{*}, s), \qquad (8)$$
with y* a random vector with independent entries and margins distributed as y; s is an independent vector.
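The decomposition (8) can be verified exactly in the Gaussian case, where every K term has a closed form. The sketch below assumes y ~ N(0, Σ) with unit variances and correlation ρ, y* ~ N(0, I) (independent entries with the margins of y), and s ~ N(0, D) with independent entries; the specific Σ and D are illustrative, not from the source.

```python
import numpy as np

def kl_gauss(S0, S1):
    """K(N(0, S0), N(0, S1)) for zero-mean Gaussians, in closed form."""
    k = S0.shape[0]
    inv1 = np.linalg.inv(S1)
    return 0.5 * (np.trace(inv1 @ S0) - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

rho = 0.6
Sigma = np.array([[1.0, rho], [rho, 1.0]])   # law of y (dependent outputs)
I2 = np.eye(2)                               # law of y*: independent N(0,1) margins
D = np.diag([2.0, 0.5])                      # law of s: independent sources

total = kl_gauss(Sigma, D)                   # K(y, s)
dep = kl_gauss(Sigma, I2)                    # K(y, y*): deviation from independence
marg = kl_gauss(I2, D)                       # K(y*, s): marginal mismatch
print(total, dep + marg)                     # the two agree: (8) holds exactly
```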
In the component analysis procedure, with y the outputs and s the unobserved sources, the total distance between the model and the outputs is decomposed into the deviation from independence of the outputs, K(y, y*), and the mismatch of the marginal distributions, K(y*, s):
$$\underbrace{K(y, s)}_{\text{Total Mismatch}} \;=\; \underbrace{K(y, y^{*})}_{\text{Deviation from Independence}} \;+\; \underbrace{K(y^{*}, s)}_{\text{Marginal Mismatch}} \qquad (9)$$
Set u = Ĝ(y), with Ĝ our best estimate for the marginal distributions of y, where y* is still a random, mutually independent vector with margins distributed equivalently with y; thus u* is independent with margins distributed as y. Then the KL distance is
$$K(u, \hat{u}) = K(u, u^{*}) + K(u^{*}, \hat{u}), \qquad (10)$$

with û the estimate of the true sources.
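Here is a minimal sketch of the transform u = Ĝ(y) via the empirical CDF (rank transform), which produces copula-scale pseudo-observations with uniform margins; the function name empirical_cdf_transform and the mixing matrix are illustrative, not from the source.

```python
import numpy as np

def empirical_cdf_transform(Y):
    """Map each column of Y to (0,1) by its empirical CDF: u = G_hat(y)."""
    n = Y.shape[0]
    ranks = np.argsort(np.argsort(Y, axis=0), axis=0) + 1
    return ranks / (n + 1.0)                   # copula-scale pseudo-observations

rng = np.random.default_rng(2)
S = rng.standard_normal((10000, 2))            # independent sources s
Ymix = S @ np.array([[1.0, 0.6], [0.0, 1.0]])  # mixed outputs y
U = empirical_cdf_transform(Ymix)              # uniform margins by construction

# On the copula scale the margins are fixed, so any remaining dependence
# in U reflects the deviation-from-independence term in (10).
print(np.corrcoef(U.T)[0, 1])                  # rank correlation ~ copula dependence
```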
The CICA Algorithm: Full Model, via Estimating Equations

This approach yields estimating equations, that is, equations for the parameters of the component analysis model. In this full CICA method we derive estimating equations for the mixing parameter B.
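Since the estimating equations themselves are derived in the text that follows, the sketch below stands in with a brute-force illustration of the same idea: after whitening, search a rotation parameterization of the demixer B for the angle that minimizes a histogram MI contrast. The parameterization, the contrast, and all names are assumptions for illustration, not the source's method.

```python
import numpy as np

def hist_mi(x, y, bins=30):
    """Plug-in histogram estimate of pairwise mutual information."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return (pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum()

rng = np.random.default_rng(3)
S = rng.uniform(-1, 1, size=(20000, 2))        # independent non-Gaussian sources
A = np.array([[2.0, 1.0], [1.0, 1.0]])         # true mixing matrix
X = (S @ A.T) - (S @ A.T).mean(0)              # observed, centered outputs

U, s_, Vt = np.linalg.svd(X, full_matrices=False)
Z = U * np.sqrt(X.shape[0])                    # whitened data, identity covariance

# Search over rotation angle theta for B(theta) minimizing the MI contrast.
thetas = np.linspace(0, np.pi / 2, 181)
scores = []
for th in thetas:
    B = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    Yhat = Z @ B.T                             # candidate demixed outputs
    scores.append(hist_mi(Yhat[:, 0], Yhat[:, 1]))
print("best theta:", thetas[int(np.argmin(scores))])
```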