Lemma 15.1. Let $O = [f_1, \cdots, f_K]$, where $f_k$ is a vector in $\mathbb{C}^M$. The ON basis $B_N = [\psi_1, \cdots, \psi_N]$ that minimizes

$$\frac{1}{K}\sum_{k=1}^{K}\bigl\|f_k-\tilde f_k\bigr\|^2, \qquad \text{with} \qquad \tilde f_k=\sum_{m=1}^{N} f_k(m)\,\psi_m, \tag{15.25}$$

for every integer $N$: $N<M$, is given by the first $N$ eigenvectors of the scatter
matrix:
$$S=\frac{1}{K}\sum_{k=1}^{K} f_k f_k^{H}=\frac{1}{K}\,OO^{H}, \tag{15.26}$$
where the eigenvalues are sorted as $\lambda(1) \geq \cdots \geq \lambda(M)$. The new coordinates are
given by

$$O'^{T}=O^{T}B_N, \qquad \text{with} \qquad O'=[f'_1,\cdots,f'_K]. \tag{15.27}$$
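As a numerical illustration of Lemma 15.1, the following is a minimal sketch assuming NumPy and real-valued synthetic data (the names O, B_N, O_new and the chosen dimensions are illustrative, not from the text). It forms the scatter matrix of (15.26), takes its leading N eigenvectors as the basis B_N, computes the new coordinates of (15.27), and checks that the average approximation error of (15.25) equals the sum of the discarded eigenvalues.

```python
# Minimal sketch of Lemma 15.1 (assumes NumPy; real data, so the Hermitian
# transpose reduces to the ordinary transpose).
import numpy as np

rng = np.random.default_rng(0)
M, K, N = 5, 200, 2                       # ambient dimension, observations, reduced dimension
O = rng.standard_normal((M, K))           # columns f_k are the observations

S = (O @ O.conj().T) / K                  # scatter matrix S = (1/K) O O^H, Eq. (15.26)
lam, V = np.linalg.eigh(S)                # eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]            # reorder so that lambda(1) >= ... >= lambda(M)

B_N = V[:, :N]                            # first N eigenvectors form the ON basis B_N
O_new = B_N.conj().T @ O                  # new coordinates f'_k, cf. Eq. (15.27)
O_approx = B_N @ O_new                    # approximations of the f_k in span(B_N), Eq. (15.25)

mse = np.mean(np.sum(np.abs(O - O_approx) ** 2, axis=0))
print(np.isclose(mse, lam[N:].sum()))     # error (15.25) equals the discarded eigenvalue mass
```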
Now we consider the following problem. Assuming that the data has its mass
center at the origin, we wish to search for $N = M - 1$ basis vectors to approximate
the observations $O$ in the TLS sense. Evidently, the problem can be solved by application of the
lemma. Alternatively, we can conceive of it as a hyperplane-fitting problem:
$$f^{T}\psi=0, \tag{15.28}$$

where $\psi$ is the normal of an unknown $(M-1)$-dimensional hyperplane passing
through the origin in $E_M$. Given the observations $O=[f_1,\cdots,f_K]$, which contain
noise, the equation will not be satisfied exactly but can be solved in the TLS sense,
by searching for a $\psi$ minimizing
$$\frac{1}{K}\bigl\|O^{T}\psi\bigr\|^{2}=\frac{1}{K}\,\psi^{T}OO^{T}\psi, \qquad \text{where} \qquad \|\psi\|=1, \tag{15.29}$$
which is solved by the least significant eigenvector of $OO^{T}$. When the data are
projected onto this plane by

$$f-\langle f,\psi\rangle\,\psi, \tag{15.30}$$

we have thus reduced its dimension by 1. The error in the hyperplane approximation
is $\lambda_M$, the least eigenvalue of $S=\frac{1}{K}OO^{T}$. Naturally, one could fit planes recursively to eliminate more and more dimensions until reaching any desired dimension
$N$. Accordingly, we conclude that although conceptually different, the dimension
reduction and the direction estimation are mathematically equivalent, because both
are solved by an eigen-analysis of the same scatter matrix. We summarize this as a
lemma.
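Before stating the lemma, here is a minimal numerical sketch of the hyperplane fit (15.28)-(15.30), again assuming NumPy and real, zero-mean data (all variable names are illustrative). The normal ψ is taken as the least significant eigenvector of the scatter matrix; the TLS residual (15.29) then equals λ_M, and the projection (15.30) removes the component of every observation along ψ.

```python
# Minimal sketch of TLS hyperplane fitting (assumes NumPy; real, zero-mean data).
import numpy as np

rng = np.random.default_rng(1)
M, K = 4, 500
O = rng.standard_normal((M, K))
O -= O.mean(axis=1, keepdims=True)        # place the mass center of the data at the origin

S = (O @ O.T) / K                         # scatter matrix S = (1/K) O O^T
lam, V = np.linalg.eigh(S)                # eigenvalues in ascending order
psi = V[:, 0]                             # least significant eigenvector: the hyperplane normal

tls_residual = psi @ S @ psi              # (1/K) ||O^T psi||^2, Eq. (15.29)
print(np.isclose(tls_residual, lam[0]))   # equals lambda_M, the least eigenvalue

O_proj = O - np.outer(psi, psi @ O)       # f_k - <f_k, psi> psi for every column, Eq. (15.30)
print(np.allclose(psi @ O_proj, 0.0))     # projected data has no component along psi
```

Repeating the fit on the projected data discards the next least significant direction, which is the recursive plane fitting described above.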
Lemma 15.2 (Direction and PCA). A solution $\psi$ of a homogeneous equation

$$O^{T}\psi=0, \tag{15.31}$$
 