\Psi_2 = \Sigma_{22} - W_2 W_2^T \qquad (4)
\mu_1 = \hat{\mu}_1 \qquad (5)
\mu_2 = \hat{\mu}_2 \qquad (6)
where $M_1$ and $M_2$ are arbitrary matrices such that $M_1 M_2^T = P_q$, with $P_q$ a diagonal matrix holding the canonical correlations. $U_{1q}$ and $U_{2q}$ are the first $q$ canonical directions.
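These two-view estimates can be sketched numerically. The following is a minimal NumPy illustration, not the authors' implementation: it obtains the canonical directions by whitening the sample cross-covariance and taking its SVD, then forms the ML parameters with one valid choice $M_1 = M_2 = P_q^{1/2}$, which satisfies $M_1 M_2^T = P_q$.

```python
import numpy as np

def pcca_ml(X1, X2, q):
    """ML estimates of two-view probabilistic CCA.
    X1: (n, d1), X2: (n, d2) data matrices; q: latent dimension."""
    n = X1.shape[0]
    mu1, mu2 = X1.mean(0), X2.mean(0)
    Xc1, Xc2 = X1 - mu1, X2 - mu2
    S11 = Xc1.T @ Xc1 / n          # sample covariances
    S22 = Xc2.T @ Xc2 / n
    S12 = Xc1.T @ Xc2 / n

    def inv_sqrt(S):
        # symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    S11_is, S22_is = inv_sqrt(S11), inv_sqrt(S22)
    # SVD of the whitened cross-covariance: singular values are the
    # canonical correlations, singular vectors give canonical directions
    V1, rho, V2t = np.linalg.svd(S11_is @ S12 @ S22_is)
    U1q = S11_is @ V1[:, :q]       # first q canonical directions, view 1
    U2q = S22_is @ V2t.T[:, :q]    # first q canonical directions, view 2
    Pq = np.diag(rho[:q])
    M = np.sqrt(Pq)                # one valid choice: M1 = M2 = Pq^{1/2}
    W1, W2 = S11 @ U1q @ M, S22 @ U2q @ M
    Psi1 = S11 - W1 @ W1.T         # eq. (4)-style noise covariances
    Psi2 = S22 - W2 @ W2.T
    return W1, W2, Psi1, Psi2, mu1, mu2, rho[:q]
```

The helper assumes both sample covariances are full rank (n larger than each view's dimension).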
Fig. 3. Generalization of PCCA to C different data sources used in this paper
This model is easily generalized to C different data sources. The graphical model in that case is the one shown in Figure 3. Given a set of C different sources, each source c is generated as:
z_n \sim \mathcal{N}(0, I_q)
x_{nc} \sim \mathcal{N}(W_c z_n + \mu_c, \Psi_c)
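This generative process can be sampled directly. A short sketch with hypothetical dimensions and parameters (the loading matrices, means, and noise covariances below are illustrative, not learned from data):

```python
import numpy as np

rng = np.random.default_rng(0)
q, dims = 2, [5, 4, 3]                 # latent dim and per-view dims (illustrative)
C = len(dims)
# hypothetical model parameters for each view c
W = [rng.normal(size=(d, q)) for d in dims]
mu = [rng.normal(size=d) for d in dims]
Psi = [0.1 * np.eye(d) for d in dims]  # simple isotropic noise per view

def sample(n):
    """Draw n samples from each of the C views of the generative model."""
    Z = rng.normal(size=(n, q))        # z_n ~ N(0, I_q), shared across views
    return [Z @ W[c].T + mu[c] +
            rng.multivariate_normal(np.zeros(dims[c]), Psi[c], size=n)
            for c in range(C)]         # x_nc ~ N(W_c z_n + mu_c, Psi_c)

views = sample(100)                    # list of C arrays, one per source
```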
The maximum likelihood estimates of the parameters are then given by:
W_c = \Sigma_{cc} U_{cq} M_c \qquad (7)
\Psi_c = \Sigma_{cc} - W_c W_c^T \qquad (8)
\mu_c = \hat{\mu}_c \qquad (9)
This probabilistic generalization of the CCA model is employed in our system to combine the feature descriptors extracted from the different views. We choose the probabilistic interpretation because it allows us to easily integrate the model as part of larger graphical models for action recognition.
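One concrete way such a combination can work, sketched under the linear-Gaussian model above (a generic posterior-mean computation, not necessarily the exact inference the system performs): since the views are conditionally independent given $z$, the posterior mean $E[z \mid x_1, \ldots, x_C]$ accumulates one precision-weighted term per view, yielding a single fused latent descriptor.

```python
import numpy as np

def fuse_views(xs, W, mu, Psi):
    """Posterior mean E[z | x_1..x_C] under the linear-Gaussian model.
    xs: list of observed view vectors; W, mu, Psi: per-view parameters."""
    q = W[0].shape[1]
    A = np.eye(q)          # prior precision of z is I_q
    b = np.zeros(q)
    for x, Wc, mc, Pc in zip(xs, W, mu, Psi):
        # each view adds W_c^T Psi_c^{-1} W_c to the posterior precision
        A += Wc.T @ np.linalg.solve(Pc, Wc)
        # and W_c^T Psi_c^{-1} (x_c - mu_c) to the information vector
        b += Wc.T @ np.linalg.solve(Pc, x - mc)
    return np.linalg.solve(A, b)
```

With near-noiseless views, the fused estimate recovers the shared latent coordinates; noisier views contribute proportionally less.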
4 Hidden Conditional Random Fields
Hidden Conditional Random Fields (HCRFs) [14] extend Conditional Random Fields [8] by introducing hidden state variables into the model. An HCRF is an