commonly used as a starting point for gesture recognition, which may in turn be an important element of human-robot interaction. Hence, this section provides a short overview of gesture recognition methods, including pointing gestures and interactions with objects.
7.1.2.1 The Role of Gestures in Human-Robot Interaction
A broad overview of the importance of gestures as a 'communication channel' or 'modality' for the interaction between humans and robots is given by Hofemann (2007). According to this overview, seminal works include those by Turk (2005), who emphasises the generally high relevance of non-verbal modalities such as gestures in the field of human-robot interaction, by Pavlovic et al. (1997), and by Nehaniv (2005). Pavlovic et al. (1997) suggest a subdivision into communicative and manipulative gestures. They define a gesture as the behaviour of a time-dependent vector of model parameters over a certain period. Nehaniv (2005) introduces five classes of gestures relevant for human-robot interaction: (i) '"irrelevant"/manipulative gestures', such as unintentional arm movements while running; (ii) 'side effects of expressive behaviour', e.g. hand, arm, and face movements during a conversation; (iii) 'symbolic gestures' with a clear interpretation, such as nodding; (iv) 'interactional gestures', applied e.g. for beginning or terminating an interaction; and (v) 'referential/pointing gestures', also known as deictic gestures, such as pointing to an object in order to establish a reference while communicating.
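To make these two notions concrete, the following is a minimal sketch, with illustrative names chosen here, of a gesture as a time-dependent parameter vector in the sense of Pavlovic et al. (1997), optionally labelled with one of Nehaniv's (2005) five classes (class labels paraphrased):

```python
from dataclasses import dataclass
from enum import Enum, auto

import numpy as np


class GestureClass(Enum):
    """Nehaniv's (2005) five gesture classes (labels paraphrased)."""
    IRRELEVANT_MANIPULATIVE = auto()  # e.g. arm movements while running
    EXPRESSIVE_SIDE_EFFECT = auto()   # hand/arm/face movements while talking
    SYMBOLIC = auto()                 # clear interpretation, e.g. nodding
    INTERACTIONAL = auto()            # e.g. beginning/ending an interaction
    REFERENTIAL_DEICTIC = auto()      # pointing to establish a reference


@dataclass
class Gesture:
    """A gesture in the sense of Pavlovic et al. (1997): the behaviour of
    a time-dependent vector of model parameters over a certain period."""
    timestamps: np.ndarray            # shape (T,), seconds
    parameters: np.ndarray            # shape (T, D), e.g. joint angles
    label: GestureClass | None = None

    def duration(self) -> float:
        return float(self.timestamps[-1] - self.timestamps[0])
```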
It is shown in the following sections that recognising the movements and postures of humans, especially of their hands and arms, in the context of manipulative gestures is also highly relevant for systems that safeguard human-robot interaction in industrial production environments. Hence, the algorithms required for such systems are closely related to those used for gesture recognition.
7.1.2.2 Recognition of Gestures
Black and Jepson (1998) model the movements of human body parts as 'temporal trajectories', i.e. the behaviour of a set of pose parameters over time. These trajectories are characteristic of specific gestures. To adapt the trajectory models to observed data, Black and Jepson (1998) use an extension of the CONDENSATION algorithm (Blake and Isard, 1998). The trajectories are represented by the time-dependent velocity in the horizontal and vertical directions. The trajectory-based approach of Black and Jepson (1998) has inspired many later works. As an example, joint angle trajectories inferred from the monocular, kernel particle filter-based body pose estimation system of Schmidt et al. (2006) are utilised by Hofemann (2007) for gesture recognition.
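The following is a minimal sketch of such CONDENSATION-style trajectory matching, loosely following Black and Jepson's parameterisation of a particle state by phase, amplitude, and rate; the function names, noise levels, and resampling scheme are simplifying assumptions made here, not their exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)


def match_trajectory(model_vel, observed_vel, n_particles=500, sigma_obs=0.2):
    """Match an observed 2-D velocity sequence against a model trajectory.

    Each particle carries a state (phi, alpha, rho): a phase index into
    the model, an amplitude scale, and a playback rate.

    model_vel:    (M, 2) model velocities (horizontal, vertical)
    observed_vel: (T, 2) observed velocities
    Returns the mean phase estimate after each observation.
    """
    M = len(model_vel)
    # Initialise particles: phase near the start, unit amplitude and rate.
    phi = rng.uniform(0, 5, n_particles)
    alpha = rng.normal(1.0, 0.1, n_particles)
    rho = rng.normal(1.0, 0.1, n_particles)
    phase_means = []

    for z in observed_vel:
        # Predict: advance the phase by the rate, diffuse the parameters.
        phi = np.clip(phi + rho + rng.normal(0, 0.5, n_particles), 0, M - 1)
        alpha = alpha + rng.normal(0, 0.02, n_particles)
        rho = np.clip(rho + rng.normal(0, 0.02, n_particles), 0.5, 2.0)

        # Weight: compare the observation with the scaled model velocity
        # at each particle's phase.
        pred = alpha[:, None] * model_vel[phi.astype(int)]
        err2 = np.sum((pred - z) ** 2, axis=1)
        w = np.exp(-err2 / (2.0 * sigma_obs ** 2)) + 1e-12
        w /= w.sum()

        # Resample according to the weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        phi, alpha, rho = phi[idx], alpha[idx], rho[idx]

        phase_means.append(float(np.mean(phi)))
    return phase_means
```

In such a scheme, a gesture would be reported once the mean phase of the particle set reaches the end of the model trajectory with consistently high observation likelihood.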
In contrast, in the classical work by Campbell et al. (1996), measurements of the motion of features belonging to the head and the hand are used to recognise Tai Chi gestures.
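As an illustration of such motion measurements, the sketch below computes simple velocity and head-relative position features from tracked 3-D head and hand trajectories; the feature set and all names are hypothetical choices made here and are not taken from Campbell et al. (1996). A sequence classifier, e.g. a hidden Markov model, could then operate on such feature vectors.

```python
import numpy as np


def motion_features(head, left_hand, right_hand, dt=1.0 / 30.0):
    """Illustrative motion features from tracked 3-D positions.

    head, left_hand, right_hand: (T, 3) arrays of positions over T frames.
    Returns per-frame velocities of head and hands plus hand positions
    relative to the head, so the features do not depend on where the
    person stands. (Hypothetical feature set, not Campbell et al.'s.)
    """
    feats = []
    for track in (head, left_hand, right_hand):
        feats.append(np.diff(track, axis=0) / dt)  # finite-difference velocity
    rel_l = (left_hand - head)[1:]                 # head-relative positions,
    rel_r = (right_hand - head)[1:]                # aligned with the velocities
    return np.hstack(feats + [rel_l, rel_r])       # shape (T - 1, 15)
```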