we present the basic math and algorithms necessary to implement isometric virtual
walking, and then show with reference coordinates how limitations of virtual inter-
action space can be alleviated with traveling techniques. In Sect. 10.4 we describe
nonisometric transformations for redirected walking, give an overview of the basic
algorithms with user-centric coordinates, and go into detail on linear and angular
scaling transformations, as well as curvature mappings. We present a simple algo-
rithm that allows practitioners to implement unrestricted redirected walking in VR
workspaces. Section 10.5 concludes the chapter.
10.2 Virtual Reality Workspaces
In order to support real walking, user movements in a VR laboratory have to be
tracked and mapped to motions in a three-dimensional virtual scene. In particular,
movements of the user's head position in the physical workspace have to be measured
and transferred to motions of camera objects in the virtual space in order to provide
ego-centric visual feedback to the user's eyes from the virtual world. 1
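As a minimal sketch of this one-to-one mapping, the tracked head pose can simply be copied into the virtual camera each frame. The `Camera` class and `update_camera` function below are illustrative assumptions, not part of any particular VR framework:

```python
# Illustrative sketch of an isometric head-to-camera mapping.
# Camera and update_camera are assumed names for this example,
# not a specific framework's API.

class Camera:
    def __init__(self):
        # Position of the camera in virtual-scene coordinates.
        self.position = (0.0, 0.0, 0.0)

def update_camera(camera, head_position):
    """Copy the tracked physical head position one-to-one into the
    virtual scene, yielding ego-centric visual feedback."""
    camera.position = head_position
    return camera
```

In a real system this update runs once per rendered frame, with the head position supplied by the tracking system.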
Physical workspaces in VR laboratories incorporate tracking systems to measure
the position and/or orientation of objects located in the tracking space. Such track-
ing systems can differ in underlying technology, accuracy and precision of tracking
data, as well as how the user is instrumented. In particular, some VR laboratories
incorporate separate tracking systems for position and orientation measurements,
such as optical marker tracking systems that measure the head position and iner-
tial orientation sensors that measure the head orientation. The coordinate systems in
which tracking systems provide position and orientation data are not standardized,
such that usually the tracking coordinates have to be transformed into the coordi-
nate system used for the virtual scene [ 8 ]. In the following, for convenience, we
assume that virtual and physical coordinate systems are calibrated and represented
in right-handed OpenGL coordinates [28]. Thus, the y-axis points opposite to the
direction of gravity, while the x- and z-axes are orthogonal to the y-axis and to
each other, together defining the ground plane. These coordinates can easily be
derived from arbitrary tracking coordinates by reassigning the x-, y- and z-axes,
or by multiplying the z-coordinate by -1 to change the handedness.
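Both conversions amount to a one-line remapping. The input conventions below (a left-handed tracker with y up, and a right-handed tracker with z up) are assumptions for illustration; the actual axes depend on the tracking system's documentation:

```python
def lefthanded_to_opengl(p):
    """Convert a left-handed position (x, y, z) with y up into
    right-handed OpenGL coordinates by negating the z-coordinate."""
    x, y, z = p
    return (x, y, -z)

def zup_to_opengl(p):
    """Convert a right-handed position with z up into right-handed
    OpenGL coordinates (y up) by reassigning the axes; the mapping
    (x, y, z) -> (x, z, -y) preserves handedness."""
    x, y, z = p
    return (x, z, -y)
```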
Figure 10.1 illustrates such a coordinate system in a tracked workspace in a
VR laboratory. Position and orientation of tracked objects can be described as a
transformation from a specified origin of the tracking volume to the object's local
coordinate system. Tracking systems often provide position data as a translation
vector (x_r, y_r, z_r) ∈ ℝ³, and orientation data as yaw, pitch and roll angles
(ỹ_r, p̃_r, r̃_r) ∈ [0, 360)³, describing three subsequently applied rotational
transformations.
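The translation vector and the three angles can be combined into a single 4×4 homogeneous transform from the tracking-volume origin to the tracked object's local coordinate system. The rotation order used below (yaw about y, pitch about x, roll about z) is a common convention assumed for illustration; the order and axes actually used vary between tracking systems:

```python
import numpy as np

def yaw_pitch_roll_to_matrix(yaw_deg, pitch_deg, roll_deg):
    """Compose three subsequently applied rotations into one 3x3
    matrix. R = R_yaw @ R_pitch @ R_roll (yaw about y, pitch about x,
    roll about z) is an assumed convention for this sketch."""
    y, p, r = np.radians([yaw_deg, pitch_deg, roll_deg])
    Ry = np.array([[np.cos(y), 0.0, np.sin(y)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(y), 0.0, np.cos(y)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(p), -np.sin(p)],
                   [0.0, np.sin(p), np.cos(p)]])
    Rz = np.array([[np.cos(r), -np.sin(r), 0.0],
                   [np.sin(r), np.cos(r), 0.0],
                   [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz

def pose_to_transform(position, yaw_deg, pitch_deg, roll_deg):
    """Build the 4x4 homogeneous transform from the tracking-volume
    origin to the tracked object's local coordinate system."""
    T = np.eye(4)
    T[:3, :3] = yaw_pitch_roll_to_matrix(yaw_deg, pitch_deg, roll_deg)
    T[:3, 3] = position
    return T
```

With a zero rotation the result is a pure translation; a 90° yaw rotates the local x-axis toward the negative z-axis, consistent with the right-handed OpenGL convention above.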
1 While most immersive VEs implement head tracking for visual feedback, some laboratories also
implement tracking of other body parts to provide virtual body feedback or interaction methods.