Although most of these systems involve the capture and analysis of frames of
video of the whole body, or optical motion capture systems for tracking discrete
markers on the body, the Lightfoot interface adopted a somewhat different approach,
using an array of interruptible lasers to capture interactions between the feet of users.
In interactive applications, the foot movements of dancers were used to control the
real-time synthesis of musical audio feedback [11].
17.3.2 Contact Sensing
Contact sensing via floor surfaces requires the measurement of forces, areas, or
occurrences of physical contact between the body and the floor. This can be accomplished
using surface mounted force sensing arrays, via force sensors embedded within the
structure of elements of the floor itself (in the manner of a force measurement
plate or scale), via optical measurement of the foot-floor contact region(s), or via
other surface-based contact sensing techniques, such as those based on capacitance,
acoustic waves, or piezoelectric effects. Direct tactile sensing for interaction with
floor surfaces is often accomplished with surface-mounted force sensing arrays—as,
for example, in the Z-Tiles project, Magic Carpet project, ORL Active Floor project,
and others [1, 24, 28, 31, 32]. Such interfaces have been employed in applications
including person tracking, activity tracking, and musical performance. Floor-mounted
tactile sensing arrays are commercially available, but for large surface areas, costs
are high, and support for real-time interaction is often not offered commercially, since
the predominant application areas involve offline gait and posture measurement, which
does not require such features.
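As a rough illustration of the kind of per-frame processing such an array supports, the sketch below summarizes one frame from a surface-mounted force-sensing array as a contact occurrence, total force, and contact area. The grid size, noise-floor threshold, and per-sensel area are illustrative assumptions, not parameters of any cited system.

```python
# Sketch: summarizing one frame of a floor-mounted force-sensing array.
# FORCE_THRESHOLD and CELL_AREA_CM2 are assumed values for illustration.
import numpy as np

FORCE_THRESHOLD = 5.0  # newtons per sensel; assumed noise floor
CELL_AREA_CM2 = 1.0    # assumed floor area covered by one sensel


def analyze_frame(frame: np.ndarray) -> dict:
    """Reduce an M x N array of sensel forces to contact measurements."""
    active = frame > FORCE_THRESHOLD          # sensels currently in contact
    total_force = float(frame[active].sum())  # net normal force (N)
    contact_area = float(active.sum() * CELL_AREA_CM2)
    return {
        "in_contact": bool(active.any()),
        "total_force_n": total_force,
        "contact_area_cm2": contact_area,
    }


# Example: a 4 x 4 tile with a single small contact region.
frame = np.zeros((4, 4))
frame[1:3, 1:3] = 120.0  # four loaded sensels
summary = analyze_frame(frame)
```

In a real-time interactive setting, a loop over successive frames of such summaries would feed person tracking or musical control, which is precisely the capability the offline-oriented commercial systems tend to omit.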
A wide range of ambient computing tasks have served to motivate the development
of several of these systems. Orr, Abowd, and Addlesee [1] developed an activity-aware
smart home environment based, in part, on a floor surface (the ORL Active
Floor) that captured foot-floor forces during normal activities of daily living. The
Ubi-Floor allowed users to access context-aware multimedia applications selected
via a footstep-activated menuing system [9]. Headon developed a system for interacting
with games via full-body movements or gestures sensed through a floor surface.
Input gestures were recognized using statistical classification of temporal patterns of
force measurements acquired through force-sensing load cells embedded in the floor
structure [14]. Commercially available sensing pads for video games have been used
to implement novel human-computer interactive scenarios, such as the navigation of
heritage sites presented in a virtual environment [10].
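The statistical classification of temporal force patterns described above can be sketched minimally as template matching against per-gesture mean force profiles. The gesture templates, profile length, and nearest-centroid rule here are illustrative assumptions, not the classifier published in [14].

```python
# Sketch: recognizing whole-body gestures from temporal patterns of
# floor force measurements. Templates are assumed, illustrative mean
# force-vs-time profiles (arbitrary units), not trained models.
import numpy as np

TEMPLATES = {
    "jump":  np.array([1.0, 1.6, 0.2, 0.1, 1.8, 1.0]),  # unload, flight, landing
    "stomp": np.array([1.0, 1.0, 1.0, 2.5, 1.2, 1.0]),  # single sharp force peak
}


def classify(profile: np.ndarray) -> str:
    """Assign a force-time profile to the nearest template (L2 distance)."""
    return min(TEMPLATES, key=lambda g: np.linalg.norm(profile - TEMPLATES[g]))


# A jump-like observation: brief unloading followed by a landing spike.
observed = np.array([1.0, 1.5, 0.3, 0.2, 1.7, 1.0])
label = classify(observed)
```

A production system would replace the fixed templates with statistics estimated from training data, but the pipeline shape — segment a force time series, compare it against per-gesture models, emit a label — is the same.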
Steinicke et al. have investigated several scenarios associated with the use of
combinations of floor-sensed body posture and hand gestures to navigate virtual
environments [8, 35]. In one scenario, they employed the Nintendo Wii Balance
Board interface in tandem with a manually operated touch-sensitive interface to
allow users to navigate within a 3D model of a city presented in a video-projected
virtual environment simulator.
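A balance-board style interface of this kind reports the forces on four corner load cells, from which a 2D center of pressure can be derived and used to steer navigation. The following sketch shows that computation; the half-board dimensions are illustrative assumptions rather than the actual device geometry.

```python
# Sketch: center of pressure from four corner load cells of a
# balance-board interface, usable as a 2D steering signal.
HALF_W, HALF_L = 21.5, 12.0  # assumed half-width and half-length, in cm


def center_of_pressure(tr: float, tl: float, br: float, bl: float):
    """Force-weighted average of corner positions -> (x, y) from board center."""
    total = tr + tl + br + bl
    if total <= 0:
        return (0.0, 0.0)  # nobody standing on the board
    x = HALF_W * ((tr + br) - (tl + bl)) / total  # +x toward right edge
    y = HALF_L * ((tr + tl) - (br + bl)) / total  # +y toward top edge
    return (x, y)


# Leaning fully onto the right-hand cells drives x toward +HALF_W.
cop = center_of_pressure(tr=30.0, tl=0.0, br=30.0, bl=0.0)
```

Mapping the (x, y) displacement to translation or turning rate then yields the kind of leaning-based navigation used in the city-model scenario above.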
Several research groups have also studied the acquisition and analysis of iner-
tially sensed movements and foot-ground force profiles (ground reaction forces) for