Table 12.1 Subspace Learning Methods: An Overview

Categories                Methods
Reconstructive methods    Principal Component Analysis (PCA)
                          Independent Component Analysis (ICA)
                          Non-negative Matrix Factorization (NMF)
Discriminative methods    Linear Discriminant Analysis (LDA)
                          Canonical Correlation Analysis (CCA)
                          Oriented Component Analysis (OCA)
                          Relevant Component Analysis (RCA)
updating of data, which makes reconstructive methods suitable for these applications.
Principal Component Analysis (PCA) [10], Independent Component Analysis (ICA) [11], and Non-negative Matrix Factorization (NMF) [12] are common reconstructive subspace learning methods. Principal component analysis is a multivariate data analysis procedure that transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components. Independent component analysis performs a linear transform that makes the resulting variables as statistically independent of each other as possible. Non-negative matrix factorization finds a linear representation of non-negative data: given a non-negative data matrix V, NMF finds an approximate factorization V = WH in which both factors W and H are constrained to be non-negative. A minimal sketch of all three methods appears below.
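To make these definitions concrete, the following sketch applies PCA, ICA, and NMF to a small random non-negative matrix. The use of scikit-learn, the component counts, and the solver settings are illustrative assumptions, not the implementations discussed in this paper.

```python
# Sketch of the three reconstructive methods (assumed library: scikit-learn).
import numpy as np
from sklearn.decomposition import PCA, FastICA, NMF

rng = np.random.default_rng(0)
V = rng.random((100, 20))          # non-negative data: 100 samples, 20 features

# PCA: map possibly correlated features onto a smaller set of
# uncorrelated principal components.
pca = PCA(n_components=5)
X_pca = pca.fit_transform(V)       # shape (100, 5)

# ICA: linear transform making the recovered components as
# statistically independent as possible.
ica = FastICA(n_components=5, random_state=0)
X_ica = ica.fit_transform(V)

# NMF: approximate factorization V = WH with W >= 0 and H >= 0.
nmf = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
W = nmf.fit_transform(V)           # W: (100, 5)
H = nmf.components_                # H: (5, 20)
print(np.linalg.norm(V - W @ H))   # NMF reconstruction error
```

In each case the fitted transform projects the data onto a lower-dimensional subspace; only NMF requires the input to be non-negative.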
Discriminant subspace learning: These methods separate the data well and hence provide good classification. Discriminant methods are supervised, task dependent, and computationally efficient. Linear Discriminant Analysis (LDA) [13], Canonical Correlation Analysis (CCA) [14], Oriented Component Analysis (OCA), and Relevant Component Analysis (RCA) [15, 16] are the common discriminant methods. LDA projects the data onto a lower-dimensional vector space such that the between-class variance is maximized and the within-class variance is minimized. Canonical Correlation Analysis (CCA) finds linear relationships between two multidimensional variables, one representing a set of independent variables and the other a set of dependent variables; it can be seen as the problem of finding basis vectors for the two sets of variables such that the correlations between the projections of the variables onto these basis vectors are mutually maximized. Oriented Component Analysis (OCA) maximizes the signal-to-signal ratio between two random vectors and is used for dimension reduction. Relevant Component Analysis (RCA) is motivated by a frequently encountered problem: the original data representation contains variability that is not relevant to the task, and this variability reduces the quality of the results. A sketch of LDA and CCA follows this paragraph.
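The sketch below illustrates the two most widely used discriminant methods on toy data, again with scikit-learn as an assumed library; the class labels, the split into two variable sets, and the component counts are illustrative only.

```python
# Sketch of LDA and CCA (assumed library: scikit-learn).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)

# LDA: supervised projection that maximizes between-class variance
# while minimizing within-class variance; needs class labels y.
X = rng.random((150, 10))
y = rng.integers(0, 3, size=150)   # 3 classes -> at most 2 LDA components
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)    # shape (150, 2)

# CCA: basis vectors for two variable sets such that the correlation
# between the projections onto those bases is maximized.
X1 = rng.random((150, 6))          # "independent" variable set
X2 = rng.random((150, 4))          # "dependent" variable set
cca = CCA(n_components=2)
U, Vc = cca.fit_transform(X1, X2)  # paired canonical projections
```

Note that LDA can produce at most (number of classes - 1) discriminant directions, which is why two components are used for the three-class toy data above.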
Table 12.1 shows the common methods of reconstructive and discriminant subspace learning. In this paper, we present background modeling methods using PCA and ICA, face and object recognition using LDA, and finally object tracking using the covariance matrix.
 