Of course, the mapping does not typically transform an intermediate image pixel back to the center of a source image pixel. The floating-point coordinates of the source image location could be rounded to the nearest pixel coordinates, but this would introduce aliasing artifacts. To avoid such artifacts in the intermediate image, the corner points of the intermediate pixel could be mapped back to the source image, producing a quadrilateral area there; the pixels wholly or partially contained in this quadrilateral would then contribute to the color of the intermediate pixel. A rough sketch of this sampling strategy is given below.
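As a rough illustration of that idea (not code from the text), the following maps the four corners of an intermediate pixel back to the source image and, as a simplification, averages the source pixels covered by the bounding box of the resulting quadrilateral rather than computing exact coverage. The helper map_back and the NumPy image layout are assumptions of this sketch.

```python
import numpy as np

def area_sample(src, map_back, ix, iy):
    """Estimate the color of intermediate pixel (ix, iy).

    src      : H x W x 3 source image as a NumPy array (assumed layout)
    map_back : function (x, y) -> (sx, sy) giving the source-image position
               of a point in the intermediate image (assumed helper)
    """
    h, w = src.shape[:2]
    # Map the four corners of the unit pixel square back to the source image.
    corners = [map_back(ix + dx, iy + dy) for dx in (0.0, 1.0) for dy in (0.0, 1.0)]
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    # Use the quadrilateral's bounding box as a simple stand-in for exact
    # coverage; clamp it to the image borders.
    x0, x1 = max(int(np.floor(min(xs))), 0), min(int(np.ceil(max(xs))), w)
    y0, y1 = max(int(np.floor(min(ys))), 0), min(int(np.ceil(max(ys))), h)
    if x0 >= x1 or y0 >= y1:
        # Degenerate footprint: fall back to nearest-neighbor sampling.
        sx, sy = map_back(ix + 0.5, iy + 0.5)
        return src[int(np.clip(sy, 0, h - 1)), int(np.clip(sx, 0, w - 1))]
    # Average every source pixel at least partially inside the box.
    return src[y0:y1, x0:x1].mean(axis=(0, 1))
```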
The mapping described so far is an affine transformation. For image pairs to be really interesting and useful, multiple line pairs must be used to establish correspondences between multiple features in the images. For a pair of images with multiple feature lines, each feature line pair produces a displacement vector from an intermediate image pixel to its source image position. Associated with this displacement is a weight based on the pixel's position relative to the feature line in the intermediate image. The weight presented by Beier and Neely [2] is shown in Equation 4.10.
W = \left( \frac{\left| Q_2 Q_1 \right|^{p}}{a + \mathrm{dist}} \right)^{b} \qquad (4.10)
The line is defined by points Q1 and Q2, and dist is the distance of the pixel P from the line. The distance is measured to the finite line segment defined by Q1 and Q2: if the perpendicular projection of P onto the infinite line through Q1 and Q2 falls beyond the finite segment, the distance is taken to the closer of the two endpoints; otherwise, it is the perpendicular distance to the segment. User-supplied parameters (a, b, p in Eq. 4.10) control the overall character of the mapping. As dist increases, W decreases but never reaches zero; as a practical matter, a lower limit could be set below which W is clamped to zero, so that a feature line's effect on a point beyond a certain distance is ignored. If a is nearly zero, then pixels on the line are rigidly transformed with the line. Increasing a makes the effect of lines over the image smoother. Increasing p increases the effect of longer lines. Increasing b makes the effect of a line fall off more rapidly with distance. As presented here, these parameters are global; for more precise control they could be set on a feature-line-by-feature-line basis. For a given pixel in the intermediate image, the displacement indicated by each feature line pair is scaled by its weight. The weights and the weighted displacements are accumulated, and the final accumulated displacement is divided by the accumulated weights. This gives the displacement from the intermediate pixel to its corresponding position in the source image. See the code segment in Figure 4.51.
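Figure 4.51 is not reproduced in this excerpt, but a minimal sketch of the accumulation just described might look like the following. The parameter values, the map_with_line helper (the single-line-pair mapping), and the data layout are assumptions of this sketch, not definitions from the text.

```python
import math

# Global parameters of Eq. 4.10 (illustrative values only; these are assumptions).
A, B, P = 0.01, 2.0, 0.5

def dist_to_segment(px, py, q1, q2):
    """Distance from point P to the finite segment Q1-Q2, as described in the text."""
    vx, vy = q2[0] - q1[0], q2[1] - q1[1]
    wx, wy = px - q1[0], py - q1[1]
    seg_len2 = vx * vx + vy * vy
    if seg_len2 == 0.0:               # degenerate line: distance to the single point
        return math.hypot(wx, wy)
    # Parameter of the perpendicular projection onto the infinite line.
    t = (wx * vx + wy * vy) / seg_len2
    if t <= 0.0:                      # projection falls before Q1
        cx, cy = q1
    elif t >= 1.0:                    # projection falls beyond Q2
        cx, cy = q2
    else:                             # projection falls on the segment
        cx, cy = q1[0] + t * vx, q1[1] + t * vy
    return math.hypot(px - cx, py - cy)

def source_position(px, py, line_pairs, map_with_line):
    """Weighted combination of the per-line displacements for pixel (px, py).

    line_pairs    : list of (intermediate_line, source_line), each line a pair (Q1, Q2)
    map_with_line : function giving the source position of (px, py) under a single
                    line-pair mapping (assumed to be defined elsewhere)
    """
    disp_sum = [0.0, 0.0]
    weight_sum = 0.0
    for inter_line, src_line in line_pairs:
        q1, q2 = inter_line
        sx, sy = map_with_line(px, py, inter_line, src_line)
        dx, dy = sx - px, sy - py                      # displacement toward the source
        length = math.hypot(q2[0] - q1[0], q2[1] - q1[1])
        dist = dist_to_segment(px, py, q1, q2)
        w = (length ** P / (A + dist)) ** B            # Eq. 4.10
        disp_sum[0] += w * dx
        disp_sum[1] += w * dy
        weight_sum += w
    # Divide the accumulated displacement by the accumulated weights
    # (at least one non-degenerate line pair is assumed, so weight_sum > 0).
    return px + disp_sum[0] / weight_sum, py + disp_sum[1] / weight_sum
```

The lower-limit clamp mentioned above would amount to skipping any line pair whose weight falls below a chosen threshold before the accumulation.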
When morphing between two images, the feature lines are interpolated over some number of frames. For any one of the intermediate frames, the interpolated feature lines induce a mapping back to the source image and forward to the destination image. The corresponding pixels from both images are identified and their colors blended to produce a pixel of the intermediate frame. In this way, feature-based morphing produces a sequence of images that transforms from the source image to the destination image.
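A compact sketch of that per-frame step, assuming hypothetical helpers interpolate_lines and warp_image (a warp built from the weighted mapping above) and float-valued image arrays:

```python
def morph_frame(src_img, dst_img, src_lines, dst_lines, t, warp_image, interpolate_lines):
    """Produce the intermediate frame at parameter t in [0, 1].

    warp_image(img, from_lines, to_lines) warps an image so that from_lines
    land on to_lines; interpolate_lines(a, b, t) linearly interpolates
    feature-line endpoints. Both are assumed helpers, not code from the text.
    """
    inter_lines = interpolate_lines(src_lines, dst_lines, t)
    # Warp each endpoint image so its features align with the interpolated lines.
    warped_src = warp_image(src_img, src_lines, inter_lines)
    warped_dst = warp_image(dst_img, dst_lines, inter_lines)
    # Blend corresponding pixels to produce the intermediate frame.
    return (1.0 - t) * warped_src + t * warped_dst
```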
The transformations implied by the feature lines are fairly intuitive, but some care must be taken in defining transformations with multiple line pairs. Pixels that lie on a feature line are mapped onto that feature line in the other image. If feature lines cross in one image, pixels at the intersection of the feature lines are mapped to both feature lines in the other image. This situation essentially tries to pull apart the image and can produce unwanted results. Also, some configurations of feature lines can produce nonintuitive results. Other techniques in the literature (e.g., [21]) have suggested algorithms to alleviate these shortcomings.
 