the intentions and emotional states to be conveyed in synchronized
sequences of multimodal behaviors. These behaviors are encoded into
the Behavior Mark-up Language—BML (Kopp et al., 2006; Vilhjálmsson
et al., 2007). The third module, Behavior Realizer, receives this list of
signals and computes the corresponding animation. In the remainder
of this section, we will detail these representation languages.
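As a rough illustration of this hand-off, the sketch below shows the kind of BML block a Behavior Planner might pass to the Behavior Realizer: speech, a beat gesture, and a gaze shift synchronized through BML sync points. The ids, the gaze target, and the namespace URI are placeholder assumptions rather than the output of any particular system.

<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="bml1">
  <speech id="s1">
    <text>Nice to <sync id="tm1"/> see you again.</text>
  </speech>
  <!-- Align the gesture stroke with the word marked by the sync point. -->
  <gesture id="g1" lexeme="BEAT" stroke="s1:tm1"/>
  <!-- Look at the interlocutor for the duration of the utterance. -->
  <gaze id="gz1" target="user" start="s1:start" end="s1:end"/>
</bml>

Sync references such as stroke="s1:tm1" are what allow the Behavior Realizer to compute an animation in which speech, gesture, and gaze unfold in the synchronized fashion requested by the Behavior Planner.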
3.1 Emotion Mark-up Language—EmotionML
Emotions are crucial for simulating liveliness in virtual agents.
Emotion is a complex phenomenon involving cognitive processes
and physiological and behavioral changes. Emotion is defined as “an
episode of interrelated, synchronized changes in several components
in response to an event of major significance…” (Scherer, 2000). It is
largely accepted that five major components are part of the emotion
process (Scherer, 2000):
• Neurophysiological and autonomic patterns (in the central and
autonomic nervous systems)
• Motor expression (in face, gesture, gaze)
• Feelings (subjective experience)
• Action tendencies (action readiness)
• Cognitive processing (mental processes)
Different theories of emotion have been proposed. We report here
the three main ones:
• Discrete theory, which posits an innate neural program that leads
to specific bodily response patterns (Ekman and Friesen, 2003);
prototypical facial expressions exist for each emotion.
• Dimensional theory, which describes emotion along several
dimensions. The circumplex of Russell (Russell, 1980) and PAD
(Mehrabian, 1996) are two of the main examples. The first
considers two dimensions, Pleasure and Arousal, whilst the
second adds a third dimension, namely Dominance.
• Appraisal theory, which views emotions as arising from the
subjective evaluation of the significance of an event, an object,
or a person (Arnold, 1960; Scherer, 2000).
The first attempts to control virtual agents (e.g. VHML (Beard and
Reid, 2002)) considered the six basic emotions (Ekman and Friesen,
2003). More recently, EmotionML (Schröder et al., 2011), a W3C
standard, was designed to allow emotions to be described using any of
the three theoretical perspectives outlined above.
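As an illustration, the fragment below sketches how one and the same emotional state could be annotated under each of these three perspectives with EmotionML. The vocabulary URIs and item names are assumptions based on the W3C emotion vocabulary note and should be checked against the specification.

<emotionml xmlns="http://www.w3.org/2009/10/emotionml"
           category-set="http://www.w3.org/TR/emotion-voc/xml#big6"
           dimension-set="http://www.w3.org/TR/emotion-voc/xml#pad-dimensions"
           appraisal-set="http://www.w3.org/TR/emotion-voc/xml#scherer-appraisals">
  <!-- Discrete view: one of the six basic emotions. -->
  <emotion>
    <category name="happiness" value="0.8"/>
  </emotion>
  <!-- Dimensional view: Pleasure-Arousal-Dominance coordinates. -->
  <emotion>
    <dimension name="pleasure" value="0.8"/>
    <dimension name="arousal" value="0.6"/>
    <dimension name="dominance" value="0.5"/>
  </emotion>
  <!-- Appraisal view: subjective evaluation of the triggering event
       (item names assumed from Scherer-style appraisal checks). -->
  <emotion>
    <appraisal name="suddenness" value="0.7"/>
    <appraisal name="intrinsic-pleasantness" value="0.9"/>
  </emotion>
</emotionml>

Declaring the vocabularies on the root element and then choosing categories, dimensions, or appraisals per emotion element is what lets a single document mix the discrete, dimensional, and appraisal views described above.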