[Figure 7.1 image: panels (a)-(d), with labels for the camera, light probe, CG objects, CG camera (viewpoint), distant scene (far field), and local scene (near field).]
Figure 7.1. The basic process of capturing a real environment map for IBL. (After [Debevec 98] © 1998 ACM, Inc. Included here by permission.)
3. An omnidirectional HDR radiance map is constructed from the light probe images. The images from each viewpoint are first converted to an HDR image using Debevec and Malik's technique (see Chapter 6), then these HDR images are stitched together into an HDR radiance map of the environment as it appears from the center of the sphere (Figure 7.1(c)).
4. A model of the local environment is constructed. The model only needs to include the part of the environment close to the CG object. This model is used to simulate near-field illumination effects such as interreflection and shadowing. The surface reflectance properties of local objects need to be known, but only approximately (Figure 7.1(d)).
5. The CG object is added to the local model and rendered using the captured
radiance map as the source of illumination.
6. The rendered object is composited into the photograph from Step 1.
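The HDR assembly in Step 3 can be sketched in a few lines. The code below is a minimal illustration, not Debevec and Malik's full method (which also recovers the camera's response curve from the exposures, as described in Chapter 6): it assumes a linear camera response and pixel values normalized to [0, 1], and merges the differently exposed images with a hat-shaped weighting that trusts well-exposed pixels most. All function and variable names here are hypothetical.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge differently exposed images of the same scene into one HDR
    radiance map. Each pixel's radiance is estimated as a weighted
    average of (pixel value / exposure time) across exposures; the hat
    weight peaks at mid-gray, down-weighting clipped or noisy pixels.
    Assumes a linear camera response and values in [0, 1]."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # hat weight: 1 at 0.5, 0 at 0 and 1
        num += w * img / t                 # radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-8)     # avoid division by zero
```

For example, a surface of true radiance 0.3 photographed at exposure times 1 and 2 yields pixel values 0.3 and 0.6; both per-exposure estimates agree, so the merged map returns 0.3.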
The illumination recovery thus has two elements: the radiance map of the environment, which represents the far-field illumination, and the construction of the local model for near-field effects. The local model is also needed so that the CG object can cast shadows into the scene. Constructing the local model is comparatively more difficult than obtaining the radiance map from the light probe.
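Once the radiance map is available, the renderer queries it for far-field illumination by converting each sampled direction into map coordinates. A minimal sketch for a latitude-longitude map, assuming +y is up, a nearest-neighbor lookup, and hypothetical names (real renderers would interpolate and importance-sample):

```python
import numpy as np

def sample_lat_long(env_map, direction):
    """Return the far-field radiance stored in a latitude-longitude
    environment map of shape (H, W, 3) for a world-space direction.
    Rows span elevation (top row = straight up), columns span azimuth."""
    x, y, z = direction / np.linalg.norm(direction)
    u = np.arctan2(x, -z) / (2.0 * np.pi) + 0.5       # azimuth  -> [0, 1]
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi      # elevation -> [0, 1]
    h, w, _ = env_map.shape
    col = min(int(u * w), w - 1)                      # nearest-neighbor sample
    row = min(int(v * h), h - 1)
    return env_map[row, col]
```

A direction pointing straight up lands in the top row of the map; the horizon maps to the middle row. This is the sense in which the radiance map "is" the illumination: every outgoing shading direction simply indexes into the captured image.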
 