(4)

which means the first M − 1 eigenvectors v and eigenvalues λ can be obtained by calculating WW^T.
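This eigendecomposition shortcut can be sketched as follows. This is an illustrative sketch, not the chapter's code: it assumes, as in the standard eigenface method, that the columns of W are the M mean-subtracted training images, so the small M × M companion matrix W^T W shares its eigenvalues with the large matrix, and its eigenvectors map back via u_i = W v_i.

```python
import numpy as np

# Illustrative sketch: W's columns are assumed to be M mean-subtracted
# training images, each flattened to N pixels, with N >> M.
rng = np.random.default_rng(0)
N, M = 1024, 10
W = rng.standard_normal((N, M))

# Eigendecomposing the N x N matrix W @ W.T directly is expensive, so
# decompose the much smaller M x M matrix W.T @ W instead.
lam, v = np.linalg.eigh(W.T @ W)   # eigenvalues lam, eigenvectors v (columns)

# Each small eigenvector v_i maps to an eigenvector u_i = W v_i of W @ W.T:
# (W W^T)(W v_i) = W (W^T W) v_i = lam_i (W v_i)
U = W @ v                          # N x M, columns are unnormalized eigenvectors
U /= np.linalg.norm(U, axis=0)     # normalize each column to unit length
```

This yields at most M − 1 meaningful eigenvectors, since mean subtraction removes one degree of freedom from the M images.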
When we have M eigenvectors and eigenvalues, the images can be projected onto M′ dimensions by computing
(5)

where Ω is the vector of projected values. Finally, to determine which face provides the best description of an input image, the Euclidean distance is calculated using Equation (6):

(6) ε_k = ‖Ω − Ω_k‖

where Ω_k describes the kth face class; the minimum ε_k assigns the unknown data to class k.
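The projection and nearest-class decision can be sketched as below. The eigenvector matrix U, mean face, and class data here are random stand-ins for illustration, not the chapter's actual eigenfaces.

```python
import numpy as np

# Illustrative sketch of the projection and minimum-distance classification:
# project images onto M' eigenvectors, then pick the class with smallest eps_k.
rng = np.random.default_rng(1)
N, M_prime = 1024, 5
U = np.linalg.qr(rng.standard_normal((N, M_prime)))[0]  # orthonormal columns
mean_face = rng.standard_normal(N)

def project(x):
    # Omega = U^T (x - mean): coordinates in the projected space (Equation 5)
    return U.T @ (x - mean_face)

# One known image per class k, with its stored projection Omega_k
class_images = {k: mean_face + rng.standard_normal(N) for k in range(3)}
class_omegas = {k: project(img) for k, img in class_images.items()}

def classify(x):
    # eps_k = ||Omega - Omega_k|| (Equation 6); the minimum decides the class
    omega = project(x)
    dists = {k: np.linalg.norm(omega - om) for k, om in class_omegas.items()}
    return min(dists, key=dists.get)
```

Because distances are computed in the M′-dimensional projected space rather than on raw pixels, classification stays cheap even for large images.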
4 Kernel principal component analysis
4.1 KPCA Algorithm
Unlike PCA, KPCA extracts the features of the data nonlinearly. It obtains the principal components in F, a high-dimensional feature space that is related to the input space nonlinearly. The main idea of KPCA is to first map the input data into the feature space F using a nonlinear mapping Φ; once the input data have been mapped nonlinearly, PCA is performed in F. The covariance matrix of F can be defined as

(7) C = (1/M) ∑_{j=1}^{M} Φ(x_j) Φ(x_j)^T

where M is the number of input data.
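In practice the covariance in F is never formed explicitly; the standard kernel trick works with the M × M kernel matrix K_ij = ⟨Φ(x_i), Φ(x_j)⟩ instead. A minimal sketch, assuming an RBF kernel and an illustrative γ (the chapter does not specify a kernel here):

```python
import numpy as np

# Illustrative KPCA sketch: rather than building the covariance in F
# (Equation 7), center and eigendecompose the M x M kernel matrix.
rng = np.random.default_rng(2)
M, d = 50, 3
X = rng.standard_normal((M, d))       # M input points of dimension d

def rbf_kernel(X, gamma=0.5):
    # K_ij = exp(-gamma * ||x_i - x_j||^2), an assumed kernel choice
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

K = rbf_kernel(X)
# Centering K is equivalent to centering the mapped data Phi(x_i) in F
one = np.full((M, M), 1.0 / M)
Kc = K - one @ K - K @ one + one @ K @ one
lam, alpha = np.linalg.eigh(Kc)       # eigenvalues in ascending order

# Projections of the training data onto the top two nonlinear components;
# each alpha_k is scaled by 1/sqrt(lam_k) so the components in F have unit norm
Y = (Kc @ alpha[:, -2:]) / np.sqrt(lam[-2:])
```

The centering step matters: Equation (7) assumes the mapped data have zero mean in F, which cannot be enforced by preprocessing the inputs and must instead be applied to K itself.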