Finally, the last parameter, PWR, controls the dynamic properties of
the movement. Powerful movements are expected to have higher
acceleration and deceleration magnitudes. They should also exhibit
less overshoot. This effect is obtained by mapping the power value to
tension and bias parameters of TCB spline curves. Thus, similarly to
the fluidity parameter, power modifies the trajectory of the movement.
By controlling the PWR value, one may specify gestures that are weak
and relaxed or, conversely, strong and tense.
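As an illustration, the sketch below shows how a power value might be mapped onto the tension and bias parameters of a Kochanek-Bartels (TCB) spline, whose tangents then drive cubic Hermite interpolation between gesture keyframes. The tangent and Hermite formulas are the standard TCB ones; the power_to_tcb mapping and its coefficients are purely illustrative assumptions, not the mapping used by Hartmann et al. (2005).

```python
import numpy as np

def tcb_tangents(p_prev, p, p_next, tension=0.0, continuity=0.0, bias=0.0):
    """Kochanek-Bartels (TCB) tangents at keyframe p.

    Returns (outgoing, incoming): the tangent used by the Hermite segment
    leaving p and the tangent used by the segment arriving at p.
    """
    d_in = p - p_prev
    d_out = p_next - p
    outgoing = (0.5 * (1 - tension) * (1 + bias) * (1 + continuity) * d_in
                + 0.5 * (1 - tension) * (1 - bias) * (1 - continuity) * d_out)
    incoming = (0.5 * (1 - tension) * (1 + bias) * (1 - continuity) * d_in
                + 0.5 * (1 - tension) * (1 - bias) * (1 + continuity) * d_out)
    return outgoing, incoming

def power_to_tcb(pwr):
    """Hypothetical mapping from a power value in [-1, 1] to TCB parameters.

    Assumption for illustration only: higher power raises tension
    (tighter trajectory, less overshoot) and shifts the bias toward the
    preceding keyframe (sharper attack).
    """
    tension = 0.8 * pwr
    bias = 0.3 * pwr
    return tension, bias

def hermite(p0, p1, m0, m1, t):
    """Cubic Hermite interpolation between keyframes p0 and p1, t in [0, 1]."""
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

# Example: interpolate a wrist position between two keyframes of a gesture.
keys = [np.array(k, float) for k in ([0.0, 0.0], [0.3, 0.5], [0.6, 0.4], [1.0, 0.0])]
tension, bias = power_to_tcb(pwr=0.9)          # a "powerful" gesture
m_out, _ = tcb_tangents(keys[0], keys[1], keys[2], tension, 0.0, bias)
_, m_in = tcb_tangents(keys[1], keys[2], keys[3], tension, 0.0, bias)
midpoint = hermite(keys[1], keys[2], m_out, m_in, t=0.5)
```

In this sketch, raising pwr increases the tension term, which shortens the tangents and reduces overshoot around keyframes, consistent with the behavior described above.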
4. Conclusion
In this chapter, we focused on the expressive quality of gesture from the
HCI perspective. We discussed two complementary aspects: expressive
gesture quality analysis and synthesis. We argued that the features
detected in human behavior are often the same ones synthesized on
virtual agents. At the same time, different computational methods, for
both analysis and synthesis, may focus on the same expressive feature.
For this reason, we conclude the chapter by comparing these features
and the computational methods used for their analysis and synthesis.
The result of this comparison is presented in Table 1.
Table 1 shows that, in particular, Fluidity and Power do not have
a single universal interpretation. These features are also difficult to
analyze and synthesize. Indeed, in the evaluation of the model by
Hartmann et al. (2005), these two expressive features received the
lowest recognition rates when synthesized with the Greta agent. It
can also be noticed that different algorithms sometimes model similar
features under different names, as is the case with Power and Tension.
To summarize this chapter, one can observe that there is a growing
interest in the analysis and synthesis of the expressive quality of
gesture. This interest also extends to other research domains such as
robotics (e.g., Le et al., 2011). However, more research is needed to
specify and model expressive features of nonverbal behavior in a more
realistic way. Recent developments in less invasive motion capture-based
methods offer a promising methodology for studying expressive gesture
quality and building new expressive models. The availability of efficient
consumer-grade tools, such as the Kinect, opens new challenges in the
domain of expressive gesture quality analysis.
Acknowledgements
We would like to thank Dr. E. Bevacqua from National Engineering
School of Brest (France), Dr. G. Volpe from University of Genoa (Italy),