Such interfaces give users a locomotion experience that is close to natural walking
in the real world. Chapter 9 of this volume, Technologies of Locomotion Interface,
describes mechanically assisted walking interfaces such as treadmills and cycles.
This chapter is about stepping-driven interfaces that are not mechanically assisted.
In walking-in-place (WIP) interfaces, users make stepping motions but do not physically move forward. Sensor data captured from the user's in-place stepping motions, together with data from other sensors, are used to control the movement of the user's viewpoint through
the virtual scene. The primary technical challenge in WIP systems is controlling the
user's speed so that it is both responsive and smooth; direction can be set with any of
a number of techniques. Using the taxonomy in Bowman et al. [4], WIP is a hybrid
interface: physical because the user makes repeated movements, and virtual because
the user does not move through physical space.
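As a concrete illustration, the sketch below shows one simple way a WIP controller might turn in-place stepping into viewpoint motion: steps are detected from a tracked heel (or head-bob) height signal, step cadence is mapped to a target speed, and the speed is smoothed so it is responsive but not jerky. The class, parameter values, and detection scheme are illustrative assumptions, not the specific algorithms covered later in this chapter.

```python
import math

class WalkInPlaceController:
    """Minimal walking-in-place speed controller (illustrative sketch only)."""

    def __init__(self, step_threshold=0.05, speed_per_hz=0.7,
                 smoothing=0.2, timeout=1.0):
        self.step_threshold = step_threshold  # heel lift (m) that counts as a step
        self.speed_per_hz = speed_per_hz      # virtual m/s produced per step per second
        self.smoothing = smoothing            # exponential smoothing factor (0..1]
        self.timeout = timeout                # s without a step before speed decays to 0
        self._above = False                   # heel currently above the threshold?
        self._last_step_time = None
        self._prev_step_time = None
        self.speed = 0.0

    def update(self, heel_height, t):
        """Call once per frame with the tracked heel height (m) and time (s)."""
        # Rising-edge detection: a step is counted when the heel crosses the threshold.
        if heel_height > self.step_threshold and not self._above:
            self._above = True
            self._prev_step_time, self._last_step_time = self._last_step_time, t
        elif heel_height <= self.step_threshold:
            self._above = False

        # Target speed from step cadence; zero once stepping stops.
        if (self._prev_step_time is not None
                and t - self._last_step_time < self.timeout):
            interval = max(self._last_step_time - self._prev_step_time, 1e-3)
            target = self.speed_per_hz / interval
        else:
            target = 0.0

        # Exponential smoothing keeps the speed responsive but not jerky.
        self.speed += self.smoothing * (target - self.speed)
        return self.speed


def advance_viewpoint(position, heading_rad, speed, dt):
    """Move the virtual viewpoint forward along an externally chosen heading."""
    return (position[0] + speed * dt * math.cos(heading_rad),
            position[1] + speed * dt * math.sin(heading_rad))
```

Note that the heading passed to advance_viewpoint comes from whatever steering technique the application chooses (for example gaze, torso, or hand direction), echoing the point above that direction can be set separately from speed.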
In real-walking interfaces, which are purely physical interfaces in Bowman et al.'s taxonomy,
users really walk to move through the virtual scene and the physical (lab) environ-
ment. The easy case is when the virtual scene fits within the lab: There is a one-to-one
mapping between the change in the user's tracker-reported pose (position and orien-
tation) and the change in viewpoint for each frame. Speed and direction are controlled
by how fast and in what direction the user moves, just as in natural walking.
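A minimal sketch of this one-to-one mapping appears below; the (x, y, yaw) pose representation and the function name are assumptions made for illustration, but the key point is that both the translation and the rotation change are applied with a gain of exactly one.

```python
def update_viewpoint_one_to_one(view_position, view_yaw,
                                prev_tracker_pose, tracker_pose):
    """Apply the frame-to-frame tracked pose change directly to the viewpoint.

    Poses are (x, y, yaw) tuples: positions in meters, yaw in radians.
    """
    dx = tracker_pose[0] - prev_tracker_pose[0]
    dy = tracker_pose[1] - prev_tracker_pose[1]
    dyaw = tracker_pose[2] - prev_tracker_pose[2]
    # One-to-one: translation and rotation gains are both exactly 1.
    new_position = (view_position[0] + dx, view_position[1] + dy)
    return new_position, view_yaw + dyaw
```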
The more difficult real-walking case is when the virtual scene is larger than the lab:
The mapping between changes in tracker-reported pose and changes in viewpoint
can no longer be one-to-one if the user is to travel to areas in the virtual scene that lie
outside the confines of the lab. Thus, the primary technical challenge in real-walking
interfaces for large scenes is modifying the transform applied to the viewpoint (or
scene) so that the user changes her real, physical direction in a way that keeps her physical path within the lab space while she travels through the virtual scene. Recent locomotion
taxonomies have added categories for new real-walking techniques: Arns' taxonomy
includes interfaces using scaled rotation and/or scaled translation [1], and Wendt's taxonomy includes interfaces that recenter users via redirection techniques [40].
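The sketch below generalizes the one-to-one mapping with separate translation and rotation gains, in the spirit of the scaled-rotation and scaled-translation interfaces in Arns' taxonomy. It is a simplification made for illustration (redirection techniques additionally inject rotation over time to steer the user's physical path back into the lab), and the names and gain values are assumptions rather than this chapter's algorithms.

```python
def update_viewpoint_with_gains(view_position, view_yaw,
                                prev_tracker_pose, tracker_pose,
                                translation_gain=1.0, rotation_gain=1.0):
    """Apply the tracked pose change scaled by translation and rotation gains.

    With both gains equal to 1 this reduces to the one-to-one mapping above;
    other values compress or expand the user's real motion so that a virtual
    scene larger than the lab can still be covered on foot.
    """
    dx = tracker_pose[0] - prev_tracker_pose[0]
    dy = tracker_pose[1] - prev_tracker_pose[1]
    dyaw = tracker_pose[2] - prev_tracker_pose[2]
    new_position = (view_position[0] + translation_gain * dx,
                    view_position[1] + translation_gain * dy)
    return new_position, view_yaw + rotation_gain * dyaw
```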
In this chapter we discuss only stepping-driven locomotion interfaces for virtual
scenes that are larger than the lab's tracked space. The locomotion interface tech-
niques reported here were developed for immersive virtual environment (IVE) systems that use tracked head-mounted displays (HMDs). With some adaptation, walking-in-place can be used in
single- or multi-wall projection display systems. Redirected walking, one of the techniques for real-walking in large scenes, has also been employed in multi-wall display systems [28]. The interfaces described here do not require stereo viewing.
Research has shown that locomotion interfaces that require the user to make stepping
movements induce a higher sense of presence, are more natural, and enable better
user navigation than other interfaces [22, 35]. These benefits make stepping-driven
interfaces a worthy subject of study. We conclude this introduction with general goals
for locomotion interfaces in IVEs and specific goals for setting locomotion speed
and direction. We then discuss walking-in-place and real-walking virtual locomotion
interfaces in depth.
 