only to our interlocutors. Most of our gesturing is produced without
awareness of any semantic meaning being attributed to it, and additionally,
successful human interactions are possible in situations where the
interlocutors cannot see each other (on the telephone for example,
Short et al., 1976; Williams, 1977). Moreover, understanding a sign
language requires being a speaker of that language (as for any oral
language), and it is difficult (even though possible) to infer the
meaning of a message through pantomime and mute gestural
interactions (Freedman, 1972; Rimé, 1982; Krauss et al., 1991, 2000).
In contrast, other studies have shown that gestures can resolve
speech ambiguities and facilitate comprehension in noisy environments
(Kendon, 2004; Rogers, 1978; Thompson and Massaro, 1986), and that in
some interactional contexts they are more effective than speech in
communicating ideas (see the mismatch theory in Goldin-Meadow, 2003). More
remarkably, it has been shown that gestures are in semantic coherence
with speech (Kendon, 2004; McNeill, 2005), coordinated with tone units
and prosodic entities, such as pitch-accented syllables and boundary
tones (Yasinnik et al., 2004; Shattuck-Hufnagel et al., 2007; Esposito et
al., 2007) and are loosely synchronized with speech through pausing
strategies (Butterworth and Beattie, 1978; Esposito et al., 2002, 2001;
Esposito and Marinaro, 2007). The abovementioned results suggest that
gestures and speech are partners in shaping and giving kinetic and
temporal (visual and auditory) dimensions to communication, and that
gesture must be regarded as an expressive system which, in partnership
with speech, provides the communicative means for giving form to our thoughts.
Some of the abovementioned results extend to non-human primates,
which have been shown to be extremely efficient in detecting gestural
and vocal correspondences or discrepancies (Hauser et al., 1993;
Partan, 2002; Ghazanfar and Logothetis, 2003), as well as to benefit from
multimodal signals for basic survival-related evolutionary needs such
as courtship, aggression, affiliation, and the detection and localization
of prey and predators (Lewis et al., 2001; Rowe, 2002; Schneider and
Lewis, 2004). In order to evaluate the relative merits of the above theories,
more data and investigations are needed.
4.1 Data supporting the speech and gesture partnership
In assessing the role of gestures in interactive communication, our
personal interest has been in integrating this question with our
research on pausing strategies (summarized above). Our research has
suggested that pausing strategies are governed by clear cognitive
processes that aim at structuring the discourse and that account both
for the listener's current knowledge and for the novelty of the information