tool to transform large affine and projective mappings, which are nonlinear, to linear
mappings, see Sect. 13.2, in addition to the tensor solution it affords.
To stay within the limits of our scope, the following lemma, which is extensively
discussed elsewhere, e.g. [84], is given without proof.
Lemma 15.3 (SVD). An M × K matrix O can be decomposed as

O = UΣV^T    (15.39)

where Σ is an M × K diagonal matrix,² and the matrices U, V are quadratic and orthogonal.
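As a quick numerical illustration of the lemma (not part of the original text), the following sketch uses NumPy's linalg.svd on an arbitrary 3 × 5 example matrix; the full_matrices option requests the quadratic U and V of Eq. (15.39):

```python
import numpy as np

# Illustrative check of Lemma 15.3 on an arbitrary M x K example matrix.
M, K = 3, 5
O = np.random.default_rng(0).standard_normal((M, K))

# full_matrices=True returns the quadratic (square) U and V of the lemma.
U, s, Vt = np.linalg.svd(O, full_matrices=True)   # U: M x M, Vt: K x K

# Rebuild the M x K diagonal matrix Sigma from the singular values s.
Sigma = np.zeros((M, K))
Sigma[:s.size, :s.size] = np.diag(s)

assert np.allclose(U @ U.T, np.eye(M))            # U is orthogonal
assert np.allclose(Vt.T @ Vt, np.eye(K))          # V is orthogonal
assert np.allclose(O, U @ Sigma @ Vt)             # O = U Sigma V^T, Eq. (15.39)
```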
Note that the matrix O can be, and in practice is, nonquadratic. If we have K > M, we have the following decomposition format for O:

O = U Σ V^T,  with O, Σ of size M × K, U of size M × M, and V^T of size K × K    (15.40)
whereas if we have K < M, the format of the decomposition yields

O = U Σ V^T,  with O, Σ of size M × K, U of size M × M, and V^T of size K × K    (15.41)
In both cases, the matrices O , Σ have the same form, i.e. either they are both “sleep-
ing” as in Eq. (15.40), or they are both “standing” as in Eq. (15.41). Below, we will
qualify a matrix as standing if it has more rows than columns, and sleeping con-
versely.
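A small sketch of the two shape regimes, again using NumPy with arbitrary example dimensions (an illustration, not part of the text):

```python
import numpy as np

def svd_shapes(M, K):
    """Return the shapes of U, Sigma, V^T for a random M x K example matrix."""
    O = np.random.default_rng(0).standard_normal((M, K))
    U, s, Vt = np.linalg.svd(O, full_matrices=True)
    Sigma = np.zeros((M, K))
    np.fill_diagonal(Sigma, s)        # singular values on the main diagonal of Sigma
    return U.shape, Sigma.shape, Vt.shape

print(svd_shapes(3, 5))   # K > M, "sleeping" O and Sigma, Eq. (15.40): (3, 3), (3, 5), (5, 5)
print(svd_shapes(5, 3))   # K < M, "standing" O and Sigma, Eq. (15.41): (5, 5), (5, 3), (3, 3)
```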
Assuming that the diagonal elements of Σ are sorted in descending order, σ_11 ≥ ··· ≥ σ_κκ, where κ = min(M, K), and the columns of the matrices O, U, V are sorted accordingly, one can see that the symmetric, positive semidefinite matrix OO^T can be diagonalized by means of the SVD as

OO^T = UΣV^T VΣ^T U^T = UΣΣ^T U^T = UΛU^T    (15.42)
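The following sketch (NumPy, with example sizes chosen arbitrarily) checks this diagonalization numerically for a standing example matrix, and also previews the zero eigenvalues discussed next:

```python
import numpy as np

# Numerical illustration of Eq. (15.42) for a "standing" example matrix (M > K).
M, K = 6, 3
O = np.random.default_rng(1).standard_normal((M, K))
U, s, Vt = np.linalg.svd(O, full_matrices=True)

# Eigenvalues of O O^T, sorted in descending order to match the ordering of Sigma.
eigvals = np.sort(np.linalg.eigvalsh(O @ O.T))[::-1]

assert np.allclose(eigvals[:K], s**2)     # diagonal of Sigma Sigma^T = squared singular values
assert np.allclose(eigvals[K:], 0.0)      # the remaining M - K eigenvalues are zero

# U Lambda U^T with Lambda = Sigma Sigma^T reproduces O O^T, as in Eq. (15.42).
Lam = np.zeros((M, M))
Lam[:K, :K] = np.diag(s**2)
assert np.allclose(O @ O.T, U @ Lam @ U.T)
```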
Because the matrix ΣΣ^T is diagonal and has a square form, the eigenvectors of OO^T are to be found in the columns of U, whereas its eigenvalues are the diagonal elements of ΣΣ^T. The latter elegantly shows that M − K eigenvalues are automatically zero when O is a standing matrix, because the corresponding ΣΣ^T will have zeros
² A diagonal matrix is one that has zeros as off-diagonal elements, i.e., the elements of the diagonal matrix Σ are σ_ij δ(i − j).