11.4.6 Exercises for Section 11.4
EXERCISE 11.4.1 Given the SVD of the matrix M, find the Moore-Penrose pseudoinverse of M.
! EXERCISE 11.4.2 Find the CUR-decomposition of the matrix of Fig. 11.12 when we pick
two “random” rows and columns as follows:
(a) The columns for The Matrix and Alien and the rows for Jim and John.
(b) The columns for Alien and Star Wars and the rows for Jack and Jill.
(c) The columns for The Matrix and Titanic and the rows for Joe and Jane.
! EXERCISE 11.4.3 Find the CUR-decomposition of the matrix of Fig. 11.12 if the two “random” rows are both Jack and the two columns are Star Wars and Casablanca.
11.5 Summary of Chapter 11
Dimensionality Reduction: The goal of dimensionality reduction is to replace a large matrix by two or more other matrices whose sizes are much smaller than the original, but from which the original can be approximately reconstructed, usually by taking their product.
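A truncated SVD is one way to produce such small factors. The sketch below is a minimal illustration in Python with NumPy; the library choice and the sample ratings-style matrix are assumptions made for illustration, not taken from the text.

    import numpy as np

    # A small "people x movies" matrix of rank 2 (invented for illustration).
    M = np.array([[1., 1., 1., 0., 0.],
                  [3., 3., 3., 0., 0.],
                  [4., 4., 4., 0., 0.],
                  [5., 5., 5., 0., 0.],
                  [0., 0., 0., 4., 4.],
                  [0., 0., 0., 5., 5.],
                  [0., 0., 0., 2., 2.]])

    # Keep only the k largest singular values and the matching
    # columns of U and rows of V^T.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    k = 2
    M_approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The small factors reconstruct the 7 x 5 matrix -- exactly here,
    # because M has rank 2; in general the product is an approximation.
    print(np.allclose(M, M_approx))   # True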
Eigenvalues and Eigenvectors: A matrix may have several eigenvectors such that when the matrix multiplies the eigenvector, the result is a constant multiple of the eigenvector. That constant is the eigenvalue associated with this eigenvector. Together, the eigenvector and its eigenvalue are called an eigenpair.
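In symbols, (lambda, v) is an eigenpair of A when Av = lambda v. The following quick check is a sketch in Python with NumPy; the 2 x 2 matrix is an arbitrary example, not from the text.

    import numpy as np

    A = np.array([[3., 2.],
                  [2., 6.]])

    # np.linalg.eig returns the eigenvalues and the unit-length
    # eigenvectors (one per column, in matching order).
    eigvals, eigvecs = np.linalg.eig(A)

    # Verify A v = lambda v for each eigenpair.
    for lam, v in zip(eigvals, eigvecs.T):
        print(np.allclose(A @ v, lam * v))   # True for both pairs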
Finding Eigenpairs by Power Iteration: We can find the principal eigenvector (the eigenvector with the largest eigenvalue) by starting with any vector and repeatedly multiplying the current vector by the matrix to get a new vector. When the changes to the vector become small, we can treat the result as a close approximation to the principal eigenvector. By modifying the matrix, we can then use the same iteration to get the second eigenpair (that with the second-largest eigenvalue), and similarly get each of the eigenpairs in turn, in order of decreasing value of the eigenvalue.
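Below is a minimal sketch of this iteration in Python with NumPy, assuming a symmetric matrix whose dominant eigenvalue is positive; the "modify the matrix" step is implemented here by subtracting lambda1 * v1 v1^T before rerunning the iteration. The sample matrix is invented for illustration.

    import numpy as np

    def power_iteration(A, tol=1e-10, max_iter=1000):
        # Start from an arbitrary unit vector and repeatedly multiply by A.
        x = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
        for _ in range(max_iter):
            y = A @ x
            y /= np.linalg.norm(y)           # keep the vector at unit length
            if np.linalg.norm(y - x) < tol:  # changes have become small
                break
            x = y
        lam = x @ A @ x                      # Rayleigh quotient = eigenvalue
        return lam, x

    A = np.array([[3., 2.],
                  [2., 6.]])

    lam1, v1 = power_iteration(A)

    # Remove the first eigenpair from the matrix, then the same iteration
    # converges to the eigenpair with the second-largest eigenvalue.
    lam2, v2 = power_iteration(A - lam1 * np.outer(v1, v1))

    print(round(lam1, 6), round(lam2, 6))   # 7.0 and 2.0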
Principal-Component Analysis: This technique for dimensionality reduction views data consisting of a collection of points in a multidimensional space as a matrix, with rows corresponding to the points and columns to the dimensions. The product of this matrix and its transpose has eigenpairs, and the principal eigenvector can be viewed as the direction in the space along which the points best line up. The second eigenvector represents the direction in which deviations from the principal eigenvector are the greatest, and so on.
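A compact sketch of this idea in Python with NumPy follows; the four 2-D points are invented for illustration. Since M^T M is symmetric, eigh applies, and it returns eigenpairs in ascending order of eigenvalue, so the principal eigenvector is the last column.

    import numpy as np

    # One row per point; the points lie roughly along the line y = x.
    M = np.array([[1., 2.],
                  [2., 1.],
                  [3., 4.],
                  [4., 3.]])

    # Eigenpairs of M^T M, in ascending order of eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(M.T @ M)

    # Principal eigenvector: the direction along which the points
    # best line up.
    principal = eigvecs[:, -1]
    print(principal)          # ~[0.707, 0.707] up to sign: the 45-degree line

    # Re-express each point in the eigenvector coordinate system,
    # principal direction first.
    print(M @ eigvecs[:, ::-1])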