\[
\begin{bmatrix} \tfrac{3}{5} & \tfrac{4}{5} \\[2pt] -\tfrac{4}{5} & \tfrac{3}{5} \end{bmatrix}
\begin{bmatrix} 4 \\ 2 \end{bmatrix}
=
\begin{bmatrix} 4 \\ -2 \end{bmatrix}.
\tag{10.39}
\]
Verify for yourself that these really are the u-coordinates of v, that is, that the vector v really is the same as 4 u_1 + (-2) u_2.
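This check is easy to carry out numerically. The sketch below (NumPy; the variable names are ours) uses the basis vectors u_1 = (3/5, 4/5) and u_2 = (-4/5, 3/5) and the vector v = (4, 2) from this example:

```python
import numpy as np

# Basis vectors from the example: each is a unit vector, and they are orthogonal.
u1 = np.array([3/5, 4/5])
u2 = np.array([-4/5, 3/5])
v = np.array([4.0, 2.0])

# For an orthonormal basis, the u-coordinates of v are just dot products.
coords = np.array([v @ u1, v @ u2])
print(coords)  # [ 4. -2.]

# Reconstructing v from its coordinates recovers the original vector.
assert np.allclose(coords[0] * u1 + coords[1] * u2, v)
```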
10.3.7 Matrix Properties and the Singular Value
Decomposition
Because matrices are so closely tied to linear transformations, and because linear transformations are so important in graphics, we'll now briefly discuss some important properties of matrices.
First, diagonal matrices—ones with zeroes everywhere except on the diag-
onal, like the matrix M 2 for the transformation T 2 —correspond to remarkably
simple transformations: They just scale up or down each axis by some amount
(although if the amount is a negative number, the corresponding axis is also
flipped). Because of this simplicity, we'll try to understand other transformations
in terms of these diagonal matrices.
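As a concrete sketch of this behavior (the diagonal entries below are ours, not the book's M_2):

```python
import numpy as np

# A diagonal matrix scales each axis independently; a negative entry
# also flips that axis. (Example values chosen for illustration.)
D = np.diag([2.0, -3.0])

p = np.array([1.0, 1.0])
q = D @ p
print(q)  # [ 2. -3.]: x scaled by 2, y scaled by 3 and flipped
```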
Second, if the columns of the matrix M are v_1, v_2, ..., v_k ∈ R^n, and they are pairwise orthogonal unit vectors, then M^T M = I_k, the k × k identity matrix.
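A small numerical check of this fact, using a hypothetical 3 × 2 matrix whose columns are orthonormal:

```python
import numpy as np

# Two pairwise-orthogonal unit vectors in R^3 (example values are ours).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 3/5, 4/5])   # (3/5)^2 + (4/5)^2 = 1
M = np.column_stack([v1, v2])    # M is 3 x 2

# With orthonormal columns, M^T M collapses to the 2 x 2 identity.
assert np.allclose(M.T @ M, np.eye(2))
```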
In the special case where k = n, such a matrix is called orthogonal. If the determinant of the matrix is 1, then the matrix is said to be a special orthogonal matrix. In R^2, such a matrix must be a rotation matrix like the one in T_1; in R^3, the transformation associated to such a matrix corresponds to rotation around some vector by some amount.^1
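In R^2, for instance, we can confirm both properties of a rotation matrix numerically (a sketch; the angle is arbitrary):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle (radians)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonal: R^T R = I; special: det R = +1.
assert np.allclose(R.T @ R, np.eye(2))
assert np.isclose(np.linalg.det(R), 1.0)
```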
Less familiar to most students, but of enormous importance in much graphics research, is the singular value decomposition (SVD) of a matrix. Its existence says, informally, that if we have a transformation T represented by a matrix
M , and if we're willing to use new coordinate systems on both the domain and
codomain, then the transformation simply looks like a nonuniform (or possibly
uniform) scaling transformation. We'll briefly discuss this idea here, along with
the application of the SVD to solving equations; the web materials for this chapter
show the SVD for our example transformations and some further applications of
the SVD.
The singular value decomposition theorem says this: Every n × k matrix M can be factored in the form
\[
M = U D V^T,
\tag{10.40}
\]
where U is n × r (where r = min(n, k)) with orthonormal columns, D is r × r diagonal (i.e., only entries of the form d_{ii} can be nonzero), and V is k × r with orthonormal columns (see Figure 10.8).
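This factorization is available directly in NumPy (a sketch with an arbitrary example matrix; note that `np.linalg.svd` returns V^T rather than V, and `full_matrices=False` gives the "thin" r = min(n, k) shapes stated above):

```python
import numpy as np

M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])  # an arbitrary 2 x 3 example matrix

# Thin SVD: U is n x r, s holds the r singular values, Vt is r x k.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
r = min(M.shape)

assert U.shape == (2, r) and Vt.shape == (r, 3)
assert np.allclose(U.T @ U, np.eye(r))      # orthonormal columns of U
assert np.all(s[:-1] >= s[1:])              # singular values nonincreasing
assert np.allclose(U @ np.diag(s) @ Vt, M)  # M = U D V^T
```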
By convention, the entries of D are required to be in nonincreasing order (i.e., |d_{1,1}| ≥ |d_{2,2}| ≥ |d_{3,3}| ≥ ...) and are indicated by single subscripts (i.e., we write d_1 instead of d_{1,1}). They are called the singular values of M. It turns out that M is degenerate (i.e., singular) exactly if any singular value is 0. As a general
1. As we mentioned in Chapter 3, rotation about a vector in R^3 is better expressed as rotation in a plane, so instead of speaking about rotation about z, we speak of rotation in the xy-plane. We can then say that any special orthogonal matrix in R^4 corresponds to a sequence of two rotations in two planes in 4-space.
 
 