background, they don't go walleyed (that is, the gazes of the two eyes diverge), which is
very uncomfortable.
When your eyes see something that's wrong in stereo, it sort of “buzzes,” an
uncomfortable feeling. I used it to my advantage in one shot in Transformers: Dark
of the Moon, and I think it's an artistic tool people can use. There's a shot where the
camera's dollying in front of the headlights of a car and the lens flare's very different
in the two eyes. The result is that it sort of hurts to look right into the headlights,
which makes sense!
RJR: What's the typical disparity range for a movie frame?
Beier: In our terminology, objects that appear to lie on the plane of the movie screen
have zero disparity. We say that objects “beyond” the screen have positive disparities,
and objects “in front of” the screen have negative disparities. For a 2K image, we allow
a horizontal disparity range from about negative forty pixels to about positive twenty
pixels. The average person can resolve disparity differences of 0.2 pixels or so; that's
the minimum stereo difference you can recognize. So our total stereo budget is about
60 × 5 = 300 depth levels.
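As a quick check on that arithmetic (only the numbers quoted above come from the interview; the snippet is just an illustrative restatement in Python):

```python
# Stereo budget for a 2K frame, using the numbers quoted above.
min_disparity_px = -40.0  # limit for objects "in front of" the screen
max_disparity_px = 20.0   # limit for objects "beyond" the screen
resolvable_px = 0.2       # smallest disparity difference a viewer can distinguish

disparity_range_px = max_disparity_px - min_disparity_px  # 60 pixels
depth_levels = disparity_range_px / resolvable_px         # 60 x 5 = 300

print(f"usable disparity range: {disparity_range_px:.0f} px")
print(f"distinguishable depth levels: {depth_levels:.0f}")
```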
Sometimes you can have an object with ridiculous negative disparity. That's okay
if it's going too fast for you to try to focus on it. For example, if it's just particles going
past your head, you can sort of feel them going past your head, but you don't try to
focus on them and it doesn't become a problem. If it was a big ball going past your
head, that would hurt, because you'd try to focus on it and track it as it goes by.
For effects shots with big robots and spaceships, almost everything in the scene
has positive disparity. This makes sense; if an object's in front of the screen, first, it
hurts to look at, and second, it necessarily looks smaller because it's in the room with
us, so it can't be bigger than that. If the object is beyond the screen, it can be as big
as you want it to be. In a shot in space, the stars are your interocular distance apart,
and appear to be at infinity.
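To make the sign convention concrete, here is a minimal sketch assuming a common simplified rig (parallel cameras converged by shifting the two images horizontally); the camera parameters are made up for illustration and this is not a description of Digital Domain's pipeline. Points at the convergence distance land at zero disparity, points beyond it at positive disparity that levels off as depth goes to infinity, and points in front of it at negative disparity:

```python
def disparity_px(depth_m, interocular_m, convergence_m,
                 focal_mm=35.0, sensor_width_mm=36.0, image_width_px=2048):
    """Horizontal disparity in pixels for a point at the given depth.

    Parallel-camera model with image-shift convergence: zero disparity at the
    convergence distance, positive beyond it, negative in front of it.
    """
    px_per_mm = image_width_px / sensor_width_mm
    return focal_mm * interocular_m * (1.0 / convergence_m - 1.0 / depth_m) * px_per_mm

# Example: 6.5 cm interocular, convergence ("screen plane") at 10 m.
for depth in (5.0, 10.0, 50.0, 1e9):  # 1e9 m stands in for the stars
    print(f"depth {depth:>13,.0f} m -> disparity {disparity_px(depth, 0.065, 10.0):+7.2f} px")
```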
RJR: How does the interocular distance change over the course of a shot?
Beier: There have always been people on the set called focus pullers. While the
director of photography or cameraman's looking through the camera, the focus puller
is a second person who either has a hand on the camera or a remote control to set the
focus. Focus pulling is a big job and good focus pullers are well compensated. These
days, there's another person called the convergence puller who has a similar remote
control and can adjust the convergence or interocular during the shot.
It turns out that this is very important in scenes where the camera is moving. For
example, in one shot in Transformers: Dark of the Moon we're on the moon while
robots are emerging from its surface. We start out a long way away from everything,
and the interocular here is probably half a meter, since these robots are about ten
feet tall. Once we get close to them, the interocular has gone down to maybe one
centimeter. If we didn't do that, then these robots would be so in your face that they'd
be hugely separated in negative disparity space out in front of the camera. Then, we
start to widen the interocular again to give some depth to the scene; otherwise it
would look totally flat. If you don't change the interocular throughout a scene like
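Sketching the idea in the same simplified parallel-camera model as above (again an assumption for illustration, not the production rig or its numbers), a crude per-frame interocular "pull" could solve for the widest separation that keeps the nearest object inside the negative-disparity budget:

```python
def interocular_for_budget(nearest_depth_m, convergence_m,
                           neg_budget_px=-40.0, max_interocular_m=0.5,
                           focal_mm=35.0, sensor_width_mm=36.0, image_width_px=2048):
    """Widest interocular (in metres) that keeps the nearest object within the
    negative-disparity budget, in the parallel-camera model sketched earlier."""
    px_per_mm = image_width_px / sensor_width_mm
    # Disparity per metre of interocular for the nearest object (negative when
    # that object is in front of the convergence plane).
    k = focal_mm * (1.0 / convergence_m - 1.0 / nearest_depth_m) * px_per_mm
    if k >= 0.0:
        return max_interocular_m  # nearest object is at or beyond the screen plane
    return min(max_interocular_m, neg_budget_px / k)

# Dolly toward the robots: the nearest one goes from 30 m to 1 m away, with the
# convergence plane kept a little beyond it; the interocular narrows as we close in.
for nearest in (30.0, 10.0, 3.0, 1.0):
    e = interocular_for_budget(nearest, convergence_m=1.5 * nearest)
    print(f"nearest object {nearest:5.1f} m -> interocular {100.0 * e:5.1f} cm")
```

In a real shot the convergence distance would be animated as well, and the resulting curve would be smoothed and adjusted by eye rather than taken straight from a formula like this.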