2.4 Custom Events
Besides location, posture, and touch, custom events can be defined and implemented to augment the gesture acquisition process in terms of accuracy and precision, or to support a custom experience. They usually involve additional sensing equipment that needs to be held or worn. For example, Saponas et al. [41] introduced the concept of muscle-computer interfaces that allow gesture-based input by interpreting forearm electromyography, with recognition rates of 86% for pinching with one of three fingers [42]. In a much earlier study, Harling and Edwards [16] presented a technique for segmenting the command part of a gesture by using hand tension as the segmentation cue. The investigation started from the observation that the hand is more relaxed as it moves through the two boundary (start and stop) postures. Hand tension was defined as the sum of the individual finger tensions, with each finger modeled as an elastic spring following Hooke's law.
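The tension model above can be sketched as follows; the spring constants and finger extensions are illustrative assumptions, not values from the original study:

```python
# Sketch of the hand-tension segmentation cue: each finger is modeled
# as an elastic spring obeying Hooke's law, F = k * |x - x0|, and hand
# tension is the sum of the individual finger tensions. All constants
# below are illustrative assumptions.

def finger_tension(k: float, x: float, x0: float) -> float:
    """Tension of a single finger modeled as a spring (Hooke's law)."""
    return k * abs(x - x0)

def hand_tension(fingers) -> float:
    """Total hand tension: the sum of finger tensions.

    `fingers` is a sequence of (k, x, x0) tuples: spring constant,
    current finger extension, and rest-position extension.
    """
    return sum(finger_tension(k, x, x0) for k, x, x0 in fingers)

# The command part of a gesture could then be segmented where tension
# rises above that of the relaxed boundary postures.
relaxed = hand_tension([(1.0, 0.1, 0.0)] * 5)  # hand near rest
flexed = hand_tension([(1.0, 0.8, 0.0)] * 5)   # hand under tension
assert flexed > relaxed
```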
Various devices that are held or manipulated can be used to introduce and take advantage of specific events. One example is the Wii Remote (http://wii.com/), which incorporates accelerometers as well as an infrared camera to enable motion sensing. Besides motion analysis, the controller provides buttons that can be used to specify click-like events. The Wii Remote can be held and manipulated, or it can be placed in a fixed location while IR LEDs are controlled instead [30].
Such off-the-shelf devices can even be enhanced for additional sensing. For example, Kry et al. [27] describe a special device constructed on top of a SpaceNavigator by adding pressure sensors that detect the forces applied by fingertips. This adds posture recognition to position and orientation sensing, enabling better control of the actions of a virtual hand in the virtual environment.
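As a rough illustration of how motion events can be derived from accelerometer readings such as those the Wii Remote reports, the following sketch flags samples whose acceleration magnitude exceeds a threshold; the sample format and the 1.8 g threshold are assumptions for illustration, not the device's actual API:

```python
import math

def magnitude(ax: float, ay: float, az: float) -> float:
    """Euclidean magnitude of a 3-axis accelerometer sample (in g)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_shake(samples, threshold: float = 1.8):
    """Return the indices of samples whose magnitude exceeds the
    threshold, i.e. candidate shake/motion events. The default
    threshold is an illustrative assumption."""
    return [i for i, (ax, ay, az) in enumerate(samples)
            if magnitude(ax, ay, az) > threshold]

# A resting controller reads roughly 1 g; the middle sample is a jolt.
samples = [(0.0, 0.0, 1.0), (2.1, 0.4, 1.0), (0.0, 0.0, 1.0)]
assert detect_shake(samples) == [1]
```

In practice, such a threshold test would be the first stage of a motion-event pipeline, followed by filtering and gesture classification.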
It should be noted that a distinct category of events is becoming increasingly prominent: the events used in brain interfaces [36, 43], which rely on various forms of brain activity analysis and for which implementations have been demonstrated even for touch surfaces [63]. A variety of technologies exist for detecting different forms of brain activity and events, such as Magnetic Resonance Imaging (MRI), functional MRI (fMRI), magnetoencephalography (MEG), etc. Of these, electroencephalography (EEG) systems provide good recognition accuracy despite being low-cost [29]. They make use of electrodes placed on the scalp in a predefined configuration, such as the international 10-20 system. The acquired EEG signals are then analyzed using pattern recognition techniques.
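To give a concrete flavor of this analysis step, the sketch below estimates the power of an EEG signal within a frequency band (here the 8-12 Hz alpha band) using the FFT, a common feature for the pattern-recognition stage; the sampling rate and band edges are illustrative assumptions:

```python
import numpy as np

FS = 128  # sampling rate in Hz (assumed)

def band_power(signal, low: float, high: float, fs: int = FS) -> float:
    """Power of `signal` within the [low, high] Hz band, estimated
    from the real FFT's periodogram."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum() / len(signal)

# A pure 10 Hz tone concentrates its power in the alpha band (8-12 Hz)
# rather than in the beta band (13-30 Hz).
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
assert band_power(alpha_wave, 8, 12) > band_power(alpha_wave, 13, 30)
```

Band-power features like this one would typically be computed per electrode and per band, then fed to a classifier.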
3 Gestures as Events in the Human-Computer Dialogue
This section explores how the feedback of a correctly detected and recognized gesture is interpreted by users as an important event in the human-computer communication process. As in human-human dialogue, the gestures of one participant receive corresponding feedback, which is interpreted cognitively as well as emotionally. The reason why this happens can be explained by anthropomorphism which, as