! EXERCISE 11.2.2 Prove that if M is any matrix, then M^T M and MM^T are symmetric.
11.3 Singular-Value Decomposition
We now take up a second form of matrix analysis that leads to a low-dimensional representation of a high-dimensional matrix. This approach, called singular-value decomposition
(SVD), allows an exact representation of any matrix, and also makes it easy to eliminate the
less important parts of that representation to produce an approximate representation with
any desired number of dimensions. Of course, the fewer dimensions we choose, the less accurate the approximation will be.
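To make this trade-off concrete before the formal definitions, here is a minimal NumPy sketch (the random matrix and seed are arbitrary illustrations, not taken from the text; the factors U, Σ, and V^T are defined in Section 11.3.1 below). It truncates the decomposition to the k largest singular values and prints the approximation error, which shrinks as k grows:

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 6))  # arbitrary 8 x 6 example matrix

# Thin SVD: U, the singular values, and V^T.
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Keep only the k largest singular values; the Frobenius-norm error
# of the resulting rank-k approximation decreases as k grows.
for k in range(1, len(s) + 1):
    Mk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    print(k, np.linalg.norm(M - Mk))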
We begin with the necessary definitions. Then we explore the idea that the SVD defines
a small number of “concepts” that connect the rows and columns of the matrix. We show
how eliminating the least important concepts gives us a smaller representation that closely
approximates the original matrix. Next, we see how these concepts can be used to query
the original matrix more efficiently, and finally we offer an algorithm for performing the
SVD itself.
11.3.1 Definition of SVD
Let M be an m × n matrix, and let the rank of M be r. Recall that the rank of a matrix is the largest number of rows (or equivalently columns) we can choose for which no nonzero linear combination of the rows is the all-zero vector 0 (we say a set of such rows or columns is independent). Then we can find matrices U, Σ, and V as shown in Fig. 11.5 with the following properties:
(1) U is an m × r column-orthonormal matrix; that is, each of its columns is a unit vector and the dot product of any two columns is 0.
(2) V is an n × r column-orthonormal matrix. Note that we always use V in its transposed form, so it is the rows of V^T that are orthonormal.
(3) Σ is a diagonal matrix; that is, all elements not on the main diagonal are 0. The elements of Σ are called the singular values of M.
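To see these properties in action, here is a minimal sketch using NumPy (the small matrix M is an arbitrary example, not from the text). It checks that the product of the three factors reproduces M exactly, and that the orthonormality conditions in (1) and (2) hold:

import numpy as np

# A small rank-2 example matrix.
M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# numpy.linalg.svd returns U, the singular values, and V^T.
# full_matrices=False gives the "thin" SVD, whose factor shapes
# match the m x r, r x r, and r x n forms described above when
# M has full column rank.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
Sigma = np.diag(s)  # Sigma as a diagonal matrix

# M is recovered exactly (up to floating-point roundoff).
assert np.allclose(M, U @ Sigma @ Vt)

# Columns of U and rows of V^T are orthonormal.
assert np.allclose(U.T @ U, np.eye(U.shape[1]))
assert np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))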