a projective transformation is determined by its values on any four points, no three
of which are collinear. In the next chapter we'll see analogous results for 3-space,
and in the following one we'll see how to use these theorems to build a library for
representing transformations so that you don't have to spend a lot of time building
individual matrices.
Even though matrices are not as easy for humans to interpret as “This transformation sends the points A, B, and C to A′, B′, and C′,” the matrix representation of a transformation is very valuable, mostly because composition of transformations is equivalent to multiplication of matrices; performing a complex sequence of transformations on many points can be converted to multiplying the points' coordinates by a single matrix.
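For instance (a minimal sketch in Python with NumPy, which is not the book's test-bed environment; the particular rotation, translation, and points are invented for the example), composing two transformations in 2D homogeneous coordinates means multiplying their 3 × 3 matrices once and then applying that single product to every point:

```python
import numpy as np

# Rotation about the origin by 30 degrees, as a 3x3 matrix in 2D homogeneous coordinates.
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Translation by (2, 1).
T = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Compose once: "rotate, then translate" is the single matrix T @ R.
M = T @ R

# Apply the composite to many points at once; columns are homogeneous points.
points = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [1.0, 1.0, 1.0]])
print((M @ points)[:2].T)   # Euclidean coordinates of the three transformed points
```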
10.16 Exercises
Exercise 10.1: Use the 2D test bed to write a program to demonstrate windowing
transforms. The user should click and drag two rectangles, and you should com-
pute the transform between them. Subsequent clicks by the user within the first
rectangle should be shown as small dots, and the corresponding locations in the
second rectangle should also be shown as dots. Provide a Clear button to let the
user restart.
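The 2D test bed itself is not reproduced here, but the math the exercise needs can be sketched independently. A tentative version in Python with NumPy, assuming rectangles are given as (xmin, ymin, xmax, ymax) tuples (a convention chosen for this sketch, not taken from the test bed's API):

```python
import numpy as np

def window_transform(src, dst):
    """Return the 3x3 homogeneous matrix taking the axis-aligned rectangle src onto dst.

    Rectangles are (xmin, ymin, xmax, ymax); this convention is assumed here,
    not taken from the book's test-bed interface.
    """
    sx0, sy0, sx1, sy1 = src
    dx0, dy0, dx1, dy1 = dst
    scale_x = (dx1 - dx0) / (sx1 - sx0)
    scale_y = (dy1 - dy0) / (sy1 - sy0)
    # Translate src's corner to the origin, scale, then translate to dst's corner.
    to_origin = np.array([[1.0, 0.0, -sx0], [0.0, 1.0, -sy0], [0.0, 0.0, 1.0]])
    scale     = np.array([[scale_x, 0.0, 0.0], [0.0, scale_y, 0.0], [0.0, 0.0, 1.0]])
    to_dst    = np.array([[1.0, 0.0, dx0], [0.0, 1.0, dy0], [0.0, 0.0, 1.0]])
    return to_dst @ scale @ to_origin

# Each click (x, y) inside the first rectangle maps to a dot in the second.
M = window_transform((0, 0, 2, 1), (4, 4, 8, 6))
print(M @ np.array([1.0, 0.5, 1.0]))  # center of src maps to (6, 5), the center of dst
```

The same translate-scale-translate product, expressed in the test bed's own matrix types, is the transform the dots in the second rectangle should follow.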
Exercise 10.2: Multiply the 2 × 2 matrix M = [a b; c d] by the expression given in Equation 10.17 for its inverse to verify that the product really is the identity.
Exercise 10.3: Suppose that M is an n × n square matrix with SVD M = UDV^T.
(a) Why is V^T V the identity?
(b) Let i be any number from 1 to n. What is V^T v_i, where v_i denotes the ith column of V? Hint: Use part (a).
(c) What's D V^T v_i?
(d) What's M v_i in terms of u_i and d_i, the ith diagonal entry of D?
(e) Let M̃ = d_1 u_1 v_1^T + ... + d_n u_n v_n^T. Show that M̃ v_i = d_i u_i.
(f) Explain why the v_i, i = 1, ..., n, are linearly independent, and thus span R^n.
(g) Conclude that w ↦ Mw and w ↦ M̃w agree on n linearly independent vectors, and hence must be the same linear transformation of R^n.
(h) Conclude that M = M̃. Thus, the singular-value decomposition proves the theorem that every matrix can be written as a sum of outer products (i.e., matrices of the form v w^T).
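A quick numerical check of the conclusion in part (h), sketched in Python with NumPy (the random 4 × 4 matrix is only an example, not from the text): summing the outer products d_i u_i v_i^T recovers M.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))   # any square matrix will do

U, d, Vt = np.linalg.svd(M)       # M = U @ diag(d) @ Vt, with Vt = V^T

# Rebuild M as the sum of outer products d_i * u_i * v_i^T.
M_tilde = sum(d[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(d)))

print(np.allclose(M, M_tilde))    # True: the two agree up to roundoff
```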
Exercise 10.4: (a) If P, Q, and R are noncollinear points in the plane, show that Q − P and R − P are linearly independent vectors.
(b) If v_1 and v_2 are linearly independent vectors in the plane, and A is any point in the plane, show that A, B = A + v_1, and C = A + v_2 are noncollinear points. This shows that the two kinds of affine frames are equivalent.
(c) Two forms of an affine frame in 3-space are (i) four points, not all coplanar,
and (ii) one point and three linearly independent vectors. Show how to convert
from one to the other, and also describe a third possible version (Three points and
one vector? Two points and two vectors? You choose!) and show its equivalence
as well.
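For part (c) of Exercise 10.4, a minimal sketch in Python of converting between the four-point form and the point-plus-vectors form of a 3D affine frame (the function names and the sample points are invented for illustration):

```python
import numpy as np

# Two equivalent forms of an affine frame in 3-space:
#   points form:  P0, P1, P2, P3, not all lying in one plane
#   vector form:  P0 together with v1 = P1 - P0, v2 = P2 - P0, v3 = P3 - P0

def points_to_point_and_vectors(P0, P1, P2, P3):
    """Convert the four-point form to the point-plus-three-vectors form."""
    return P0, P1 - P0, P2 - P0, P3 - P0

def point_and_vectors_to_points(P0, v1, v2, v3):
    """Convert back: add each vector to the base point."""
    return P0, P0 + v1, P0 + v2, P0 + v3

P0, P1, P2, P3 = (np.array(p, dtype=float)
                  for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)])
frame = points_to_point_and_vectors(P0, P1, P2, P3)
# The three difference vectors should be linearly independent:
print(np.linalg.matrix_rank(np.column_stack(frame[1:])) == 3)  # True
```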
Exercise 10.5: We said that if the columns of the matrix M are v_1, v_2, ..., v_k ∈ R^n, and they are pairwise orthogonal unit vectors, then M^T M = I_k, the k × k identity matrix.
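A small numerical illustration of this claim, sketched in Python with NumPy (using a QR factorization to manufacture orthonormal columns is merely a convenient choice, not part of the exercise):

```python
import numpy as np

# Build a 5 x 3 matrix whose columns are pairwise orthogonal unit vectors,
# e.g. the Q factor of a reduced QR factorization of a random matrix.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))  # Q is 5 x 3 with orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: M^T M = I_k
print(np.allclose(Q @ Q.T, np.eye(5)))  # False: M M^T is generally not the n x n identity
```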
 
 