Fig. 6.1 Restoring eye contact by synthesizing a virtual view in between the surrounding camera views
Gaze awareness and stereoscopic perception are important factors in providing the level of realism needed to feel immersed in a natural social environment [33]. Unfortunately, since a camera and a display cannot occupy the same spatial position simultaneously, videoconferencing participants are unable to look each other in the eyes: a person who stares at the display is captured by the cameras as looking away at a slightly diverging angle.
An elegant way to solve this problem is to synthesize a virtual view in between the surrounding camera views, as if it were rendered by a camera positioned right behind the screen, effectively restoring the correct head position and eye contact [25], as shown in Fig. 6.1.
Synthesizing a multitude of such nearby views even supports stereoscopic displays (Fig. 6.1, top-right anaglyph) and glasses-free automultiscopic 3D displays (Fig. 6.2), where tens of nearby virtual views are projected in different directions, allowing the viewer's eyes to capture two parallax-correct images at any position in space for natural 3D perception.
This chapter presents a robust method for multicamera view synthesis for eye-gaze correction and natural 3D video rendering, based on seminal work in plane sweeping [7, 8, 13, 14]. Important improvements are proposed to target embedded vision applications on GPU-accelerated platforms.
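To make the plane-sweeping idea concrete before the detailed treatment, the following is a minimal sketch of the core loop for the simplified case of rectified stereo, where sweeping fronto-parallel depth planes reduces to testing integer disparities. It is not the chapter's method: the function name, the sum-of-absolute-differences cost, and the simple forward warp of the winning disparities toward an in-between viewpoint at position `alpha` are all illustrative assumptions.

```python
import numpy as np

def plane_sweep_virtual_view(left, right, max_disp, alpha=0.5):
    """Illustrative plane sweep for rectified stereo (not the book's method).

    For each candidate disparity (one depth plane), warp the right image
    toward the left and score photo-consistency per pixel; the winning
    plane then drives a simple forward warp that renders a virtual view
    between the two cameras at fractional position `alpha` in [0, 1].
    """
    h, w = left.shape
    best_cost = np.full((h, w), np.inf)
    best_disp = np.zeros((h, w), dtype=int)
    for d in range(max_disp + 1):
        # Warp the right image onto the current depth plane: for rectified
        # cameras this is a horizontal shift by the disparity d.
        warped = np.roll(right, d, axis=1)
        # Photo-consistency cost: absolute intensity difference (SAD per pixel).
        cost = np.abs(left.astype(float) - warped.astype(float))
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_disp[better] = d
    # Render the in-between view by shifting each left-image pixel by a
    # fraction of its disparity (naive forward warp; leaves holes unfilled).
    virtual = np.zeros_like(left, dtype=float)
    xs = np.arange(w)
    for y in range(h):
        tx = np.clip(xs - (alpha * best_disp[y]).astype(int), 0, w - 1)
        virtual[y, tx] = left[y]
    return virtual, best_disp
```

The winner-takes-all cost minimization shown here is exactly the part that maps well onto GPUs: each depth plane is one image-wide warp-and-compare pass, independent across pixels.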