14.4 Redirection in Mixed Reality Environments
Mixed reality experiences that combine elements from the physical and virtual worlds
have also been a focus for training applications, such as the Infantry Immersion
Trainer at the Marine Corps Base Camp Pendleton [10]. Traditionally, the term mixed reality
refers to the visual merging of real and virtual elements into a single
scene. However, in the Mixed Reality Lab at the Institute for Creative Technolo-
gies, we are particularly interested in enhancing virtual experiences using modalities
beyond just visuals, such as passive haptic feedback using real objects that are aligned
with virtual counterparts. Experiments have shown that the ability to reach out and
physically touch virtual objects substantially enhances the experience of the envi-
ronment when using head-mounted displays [5].
Because redirected walking requires a continuous rotation of the virtual envi-
ronment about the user, it disrupts the spatial alignment between virtual objects
and their real world counterparts in mixed reality scenarios. While researchers have
demonstrated that it is possible to combine redirected walking with passive haptic
feedback, solutions have been limited in their applicability. For example, Kohli et
al. presented an environment that redirects users to walk between multiple virtual
cylindrical pedestals that are aligned with a single physical pedestal [7]. Unfortunately,
this solution does not generalize to other types of geometry that would not be
perceptually invariant to rotation (i.e., non-cylindrical objects). Steinicke et al.
extended this approach by showing that multiple virtual objects can be mapped to
proxy props that need not match the haptic properties of the virtual object identi-
cally [20]. However, due to the gradual virtual world rotations required by redirected
walking, synchronizing virtual objects with corresponding physical props remains a
practical challenge.
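As a rough illustration of why this synchronization is hard, the sketch below estimates the gap that opens up between a virtual object and its physical proxy once redirected walking has accumulated a rotational offset about the user. It is a minimal 2-D sketch, assuming positions are (x, z) tuples in metres and that the accumulated offset is known; the function and values are illustrative and are not taken from the systems cited above.

```python
import math

def proxy_misalignment(virtual_obj_pos, physical_prop_pos, user_pos, offset_deg):
    """Gap between a virtual object and its physical proxy after redirected
    walking has rotated the virtual world about the user by offset_deg."""
    # Express the physical prop's position in virtual-world coordinates by
    # rotating it about the user through the accumulated redirection offset.
    theta = math.radians(offset_deg)
    dx = physical_prop_pos[0] - user_pos[0]
    dz = physical_prop_pos[1] - user_pos[1]
    prop_in_virtual = (
        user_pos[0] + dx * math.cos(theta) - dz * math.sin(theta),
        user_pos[1] + dx * math.sin(theta) + dz * math.cos(theta),
    )
    # The distance is what the user would feel as a mismatch when reaching out.
    return math.dist(virtual_obj_pos, prop_in_virtual)

# A prop 2 m in front of the user drifts roughly 0.35 m after only 10 degrees
# of accumulated rotation, a gap the user would feel when reaching for it.
print(round(proxy_misalignment((0.0, 2.0), (0.0, 2.0), (0.0, 0.0), 10.0), 2))
```

The drift grows with both the accumulated angle and the prop's distance from the user, which is why even slow, imperceptible rotation rates eventually pull physical props away from their virtual counterparts.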
Recent research has presented a drastically different approach to redirection that
does not require gradual rotations of the virtual world. This technique, known as
change blindness redirection, reorients the user by applying instantaneous alterations
to the architecture of the virtual world behind the user's back. So long as the user
does not directly observe the manipulation visually, minor structural changes to the
virtual environment, such as the physical orientation of doorways (see Fig. 14.3), are
difficult to detect and most often go completely unnoticed. Perceptual studies of this
technique have shown it to provide a compelling illusion—out of 77 users tested
across two experiments, only one person noticed that a scene change had occurred
[22]. Furthermore, because this technique shifts between discrete environment states,
it is much easier to deploy in mixed reality applications that provide passive haptic
feedback [25].
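The core gating logic behind such a swap can be sketched as a simple visibility test: apply the discrete change only while the affected region lies outside the user's field of view. The snippet below assumes a 2-D floor plan, a tracked heading in degrees, and a nominal head-mounted display field of view with a safety margin; the names and threshold values are hypothetical rather than the implementation evaluated in the study above.

```python
import math

def safe_to_swap(user_pos, heading_deg, change_center, fov_deg=110.0, margin_deg=20.0):
    """True if a structural change centred at change_center would occur
    outside the user's field of view (i.e. behind the user's back)."""
    dx = change_center[0] - user_pos[0]
    dz = change_center[1] - user_pos[1]
    # Bearing of the change region, measured like the heading (0 deg = +z axis).
    bearing_deg = math.degrees(math.atan2(dx, dz))
    # Signed angular offset from the viewing direction, wrapped to [-180, 180).
    offset = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    # Only swap when the region sits well outside the display's field of view.
    return abs(offset) > fov_deg / 2.0 + margin_deg

# A doorway 3 m directly behind a user who is facing +z can be reoriented
# without the change ever entering the user's view.
print(safe_to_swap(user_pos=(0.0, 0.0), heading_deg=0.0, change_center=(0.0, -3.0)))  # True
```

In practice such a test would also need to hold for the entire duration of the swap and account for any audible cues, but the angular check captures the "behind the user's back" condition described above.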
Figure 14.4 demonstrates change blindness redirection being used in a mixed real-
ity environment that combines synthetic visuals with a passive haptic gravel walking
surface. In our example application, which was themed as an environment similar to
those that might be used for training scenarios, users are instructed to search for a
cache of weapons hidden in a desert village consisting of a gravel road connecting
a series of buildings (see Fig. 14.4a). Inside each building, the location of one of
 