requiring a discretization of the entire volume. Many multiscan fusion algorithms
use Poisson surface reconstruction thanks to Kazhdan et al.'s publicly available
implementation.22
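As a rough illustration of how such a fusion step might look in practice (this is not Kazhdan et al.'s own code), the sketch below uses the Open3D library's wrapper around screened Poisson reconstruction; the file names and the octree depth value are placeholders.

# Minimal sketch: Poisson surface reconstruction on a merged point cloud,
# via Open3D's wrapper. File names and depth are illustrative assumptions.
import open3d as o3d

# Load a point cloud produced by aligning and merging several scans.
pcd = o3d.io.read_point_cloud("merged_scans.ply")

# Poisson reconstruction requires oriented normals; estimate them if absent.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.05, max_nn=30))
pcd.orient_normals_consistent_tangent_plane(30)

# The octree depth controls how finely the volume is discretized,
# and hence the resolution of the reconstructed surface.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)

o3d.io.write_triangle_mesh("fused_mesh.ply", mesh)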
8.5 INDUSTRY PERSPECTIVES
Gentle Giant Studios, in Burbank, California, provides large object, vehicle, and set
scanning for nearly every blockbuster Hollywood movie, and has scanned the faces
and bodies of thousands of actors and performers. Steve Chapman, Gentle Giant's
vice president of technology, discusses the role of LiDAR, structured light, and multi-
view stereo in visual effects.
RJR: How has the use of LiDAR for large-scale scanning of movie sets changed over the
years?
Chapman: In the 1990s we used a slow triangulation-based LiDAR scanner. At that
point, time-of-flight systems from Cyrax were the size of a dishwasher, so we found
a system developed for the French nuclear power department that was designed
to attach to a radio-controlled robot. Things are quite different now, and speed of
capture has been a primary motivation for obtaining newer gear. LiDAR doesn't work
well for smaller subjects, though we'll occasionally use it “in a pinch” for cars or
medium-sized objects.
You're usually limited on time when you're working on a movie set. Often there's
a time window where the main film unit has just “wrapped” filming and you have
a short amount of time before they come back to shoot more scenes. On the other
hand, you might come in at the end of a shoot, when they're about to destroy a set and
build another set in the same place as soon as you finish the scan. There are people
standing around looking at their watches grumbling, “When is this LiDAR thing going
to finish?”
Now that we have phase-based systems, we can scan anything from a human body
to a building, and the process is quick enough that we can hop around and collect a
more comprehensive dataset than we could with older time-of-flight systems. We can
scan an entire soundstage from one viewpoint in just a few minutes and get more data
than a time-of-flight LiDAR would collect in hours. Because the phase-based system
can read objects at a much closer range than a time-of-flight device, we can start to do
things like scan behind small set pieces, and we can simply set up dozens of scan
locations where in the past we'd have only scanned from a handful of viewpoints.
RJR: What kinds of things will a movie production do with LiDAR data once you deliver
it to them?
Chapman: Early on, few movie crews knew what to do with the data. Now everyone, from
set designers and pre-visualization departments to set extension painters, camera
trackers, particle effects artists, and character placement animators, is clamoring for
22 http://www.cs.jhu.edu/~misha/Code/PoissonRecon/