Visualizing the Eigenfaces
Now that we have trained our PCA model, what is the result? Let's inspect the dimensions
of the resulting matrix:
val rows = pc.numRows
val cols = pc.numCols
println(rows, cols)
As you should see from your console output, the matrix of principal components has 2500
rows and 10 columns:
(2500,10)
Recall that the dimension of each image is 50 x 50 = 2500 pixels, so here, we have the top 10 principal
components, each with a dimension identical to that of the input images. These principal
components can be thought of as the set of latent (or hidden) features that capture the
greatest variation in the original data.
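To make the latent-feature idea concrete, here is a small NumPy sketch (using hypothetical random data in place of real face images) showing how projecting a 2500-dimensional image onto the 10 principal components yields a compact 10-dimensional representation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-ins: 100 flattened 50 x 50 "images".
images = rng.normal(size=(100, 2500))
centered = images - images.mean(axis=0)

# Obtain the top 10 principal components from the SVD of the
# centered data; pc has the same shape as MLlib's result.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc = vt[:10].T                      # shape: (2500, 10)

# Projecting an image onto the components gives its latent features.
latent = centered @ pc              # shape: (100, 10)
print(pc.shape, latent.shape)       # (2500, 10) (100, 10)
```

Each image is thus summarized by just 10 numbers: its coordinates along the directions of greatest variation.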
Note
In facial recognition and image processing, these principal components are often referred
to as Eigenfaces, as PCA is closely related to the eigenvalue decomposition of the
covariance matrix of the original data.
See http://en.wikipedia.org/wiki/Eigenface for more details.
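The connection mentioned in the note can be checked numerically. The following NumPy sketch (a scaled-down, hypothetical illustration with random 16 x 16 "images" in place of real 50 x 50 faces) shows that the eigenvectors of the covariance matrix match the components PCA produces via the SVD:

```python
import numpy as np

# Scaled-down, hypothetical data: 20 random "images" of
# 16 x 16 = 256 pixels stand in for the real face images.
rng = np.random.default_rng(42)
images = rng.normal(size=(20, 256))
centered = images - images.mean(axis=0)

# Eigenvalue decomposition of the covariance matrix: the
# eigenvectors are the principal components (the "Eigenfaces").
cov = np.cov(centered, rowvar=False)        # 256 x 256
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]           # descending eigenvalue order
eigenfaces = eigvecs[:, order[:10]]         # top 10, shape (256, 10)

# PCA via SVD of the centered data yields the same components
# (up to sign), which is how most implementations compute them.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
svd_components = vt[:10].T                  # shape (256, 10)

print(np.allclose(np.abs(eigenfaces), np.abs(svd_components), atol=1e-6))
```

The sign ambiguity is expected: an eigenvector multiplied by -1 is still an eigenvector, so implementations may differ in sign without differing in meaning.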
Since each principal component is of the same dimension as the original images, each
component can itself be thought of and represented as an image, making it possible to
visualize the Eigenfaces as we would the input images.
As we have often done in this book, we will use functionality from the Breeze linear
algebra library, as well as Python's numpy and matplotlib, to visualize the Eigenfaces.
First, we will extract the pc variable (an MLlib matrix) into a Breeze DenseMatrix :
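A minimal sketch of this extraction is shown below; it assumes `pc` is the MLlib `Matrix` of principal components computed earlier, and that `rows` and `cols` hold its dimensions as above:

```scala
import breeze.linalg.DenseMatrix

// Both MLlib's Matrix and Breeze's DenseMatrix store their entries
// in column-major order, so the flat array can be passed through
// directly to construct an equivalent 2500 x 10 Breeze matrix.
val pcBreeze = new DenseMatrix(rows, cols, pc.toArray)
```

With the data in a Breeze `DenseMatrix`, each column can then be reshaped into a 50 x 50 image for plotting.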