SOLUTION
The eigenvalues of A are 3 and 7. Both are positive, hence matrix A is positive definite. The eigenvalues of matrix B are −2.2361 and 2.2361, therefore B is indefinite; finally, the eigenvalues of matrix C are −2 and −10, which makes it negative definite.
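The same classification can be reproduced numerically. Below is a minimal sketch assuming NumPy; the matrices A, B, and C here are hypothetical stand-ins chosen to exhibit the three sign patterns, not the matrices of the worked example.

```python
import numpy as np

def classify_definiteness(M, tol=1e-12):
    """Classify a symmetric/Hermitian matrix by the signs of its eigenvalues."""
    eigvals = np.linalg.eigvalsh(M)          # real eigenvalues of a Hermitian matrix
    if np.all(eigvals > tol):
        return "positive definite"
    if np.all(eigvals < -tol):
        return "negative definite"
    if np.all(eigvals >= -tol):
        return "positive semi-definite"
    if np.all(eigvals <= tol):
        return "negative semi-definite"
    return "indefinite"

# Hypothetical test matrices (not the A, B, C of the worked example):
A = np.array([[4.0, 1.0], [1.0, 5.0]])        # both eigenvalues positive
B = np.array([[0.0, 2.2361], [2.2361, 0.0]])  # eigenvalues of opposite sign
C = np.array([[-3.0, 1.0], [1.0, -9.0]])      # both eigenvalues negative

for name, M in [("A", A), ("B", B), ("C", C)]:
    print(name, np.linalg.eigvalsh(M), "->", classify_definiteness(M))
```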
3.10 SINGULAR VALUE DECOMPOSITION
One of the most important tools in signal processing and numerical linear algebra is
the SVD. The SVD was discovered for square matrices by Beltrami and Jordan in the
nineteenth century. The theory for general matrices was established by Eckart and
Young. We
first state the SVD theorem and then we look at its applications.
THEOREM 3.2
Let A be an m × n real or complex matrix with rank r. Then there exist unitary matrices U (m × m) and V (n × n) such that

$$A = U \Sigma V^{H} \tag{3.136}$$

where $\Sigma$ is an m × n matrix with entries

$$\Sigma_{ij} = \begin{cases} \sigma_i & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases} \tag{3.137}$$

The quantities $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > \sigma_{r+1} = \sigma_{r+2} = \cdots = \sigma_n = 0$ are called the singular values of A.
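As a quick numerical illustration of Theorem 3.2 (a sketch assuming NumPy and a hypothetical 4 × 3 matrix of rank 2), the snippet below computes the factorization, verifies that A = UΣV^H, and shows that the number of nonzero singular values equals the rank.

```python
import numpy as np

# A hypothetical 4x3 matrix of rank 2 (the third column is the sum of the first two).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0],
              [2.0, 0.0, 2.0]])

U, s, Vh = np.linalg.svd(A)          # full SVD: U is 4x4, Vh is 3x3, s holds the singular values

# Rebuild the 4x3 Sigma of Equations 3.136 and 3.137 and verify the factorization.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)
print(np.allclose(A, U @ Sigma @ Vh))               # True: A = U Sigma V^H
print(s)                                            # sigma_1 >= sigma_2 > sigma_3 ~ 0
print(np.sum(s > 1e-10), np.linalg.matrix_rank(A))  # both report rank 2
```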
Proof: Let $S = A^H A$. Matrix S is n × n Hermitian and positive semi-definite with rank r; therefore, it has nonnegative eigenvalues. Let the eigenvalues of S be $\sigma_1^2 \geq \sigma_2^2 \geq \cdots \geq \sigma_r^2 > \sigma_{r+1}^2 = \sigma_{r+2}^2 = \cdots = \sigma_n^2 = 0$ with corresponding eigenvectors $v_1, v_2, \ldots, v_n$. These eigenvectors form an orthonormal set. Let $V_1 = [v_1 \; v_2 \; \cdots \; v_r]$, $V_2 = [v_{r+1} \; v_{r+2} \; \cdots \; v_n]$, and $\Lambda = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_r)$; then

$$A^H A V_1 = V_1 \Lambda^2 \tag{3.138}$$
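Equation 3.138 can be checked directly. The sketch below (again assuming NumPy and the same hypothetical rank-2 matrix as above) compares the eigenvalues of A^H A with the squared singular values of A and verifies that A^H A V_1 = V_1 Λ².

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0],
              [2.0, 0.0, 2.0]])             # hypothetical m x n matrix with rank r = 2

S = A.conj().T @ A                          # S = A^H A, Hermitian positive semi-definite
eigvals, V = np.linalg.eigh(S)              # eigh returns eigenvalues in ascending order
eigvals, V = eigvals[::-1], V[:, ::-1]      # reorder so sigma_1^2 >= sigma_2^2 >= ...

s = np.linalg.svd(A, compute_uv=False)
print(np.allclose(eigvals, s**2))           # eigenvalues of A^H A are the squared singular values

r = 2
V1 = V[:, :r]                               # eigenvectors of the nonzero eigenvalues
Lam = np.diag(s[:r])                        # Lambda = diag(sigma_1, ..., sigma_r)
print(np.allclose(S @ V1, V1 @ Lam**2))     # Equation 3.138: A^H A V_1 = V_1 Lambda^2
```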
Pre-multiplying both sides of Equation 3.138 by $V_1^H$, followed by post- and pre-multiplication by $\Lambda^{-1}$, results in

$$\Lambda^{-1} V_1^H A^H A V_1 \Lambda^{-1} = I \tag{3.139}$$
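Equation 3.139 can likewise be verified numerically. Continuing the hypothetical example (a sketch assuming NumPy), the snippet forms Λ^{-1} V_1^H A^H A V_1 Λ^{-1}, checks that it equals the r × r identity, and previews that the matrix U_1 = A V_1 Λ^{-1} chosen in the next step therefore has orthonormal columns.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0],
              [2.0, 0.0, 2.0]])             # same hypothetical rank-2 matrix

eigvals, V = np.linalg.eigh(A.conj().T @ A)
V1 = V[:, ::-1][:, :2]                      # eigenvectors of the two nonzero eigenvalues
s = np.linalg.svd(A, compute_uv=False)[:2]
Lam_inv = np.diag(1.0 / s)                  # Lambda^{-1}

M = Lam_inv @ V1.conj().T @ A.conj().T @ A @ V1 @ Lam_inv
print(np.allclose(M, np.eye(2)))            # Equation 3.139 holds

U1 = A @ V1 @ Lam_inv                       # the U_1 constructed from Equation 3.139
print(np.allclose(U1.conj().T @ U1, np.eye(2)))   # U_1^H U_1 = I: orthonormal columns
```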
Choose $U_1 = A V_1 \Lambda^{-1}$; then by Equation 3.139 we have $U_1^H U_1 = I$. Notice that $U_1$ is an m × r matrix with orthonormal columns. Choose $U_2$ to be an m × (m − r) matrix with orthonormal columns that are orthogonal to those of $U_1$. Then