Fig. 11.1. Russell's circumplex model of emotions (Russell 1980)
In the case of automatic music classification, the number of emotion
terms is typically higher than in psychological research. Hu et al. (2009)
used 18 categories containing 135 mood tags, and the recent MIREX
(www.music-ir.org/mirex) evaluations used 29 labels divided into five
clusters (Kim et al. 2010). However, in mood-based music player
interfaces targeted at the end-user (see the next section), the number of
mood terms is typically smaller.
The dimensional approach focuses on “identifying emotions based on
their placements on a small number of dimensions, such as valence,
activity, and potency” (Sloboda & Juslin 2001, p. 77). The most well-
known dimensional emotional scale is probably Russell's (1980)
circumplex model (Fig. 11.1), which maps the y-axis to activation level
and the x-axis to valence. In the model, emotions are placed around a circle so that opposite emotions (e.g., happy and sad) lie directly across from each other.
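As a minimal sketch of this idea, each emotion term can be represented as a point in the valence-activation plane and its position on the circumplex described by an angle; the coordinates below are hypothetical placements chosen for illustration, not values taken from Russell (1980).

import math

# Hypothetical (valence, activation) coordinates for a few emotion terms;
# these are illustrative placements, not data from Russell's model.
EMOTIONS = {
    "happy": (0.8, 0.3),
    "sad":   (-0.8, -0.3),
    "tense": (-0.3, 0.8),
    "calm":  (0.3, -0.8),
}

def circumplex_angle(valence: float, activation: float) -> float:
    """Angle (degrees) of an emotion on the circumplex, measured
    counter-clockwise from the positive valence axis."""
    return math.degrees(math.atan2(activation, valence)) % 360

for name, (v, a) in EMOTIONS.items():
    print(f"{name:5s} -> {circumplex_angle(v, a):6.1f} deg")
# Opposite emotions (happy vs. sad, tense vs. calm) end up roughly
# 180 degrees apart on the circle.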
In the context of MIR, the most popular dimensional model is Thayer's
two-dimensional valence-arousal space, which is a simplified version of
Russell's more general model (Kim, Schmidt, and Emelle 2008). Thayer
maps the x-axis to stress and the y-axis to energy (Trohidis et al. 2008); in MIR work these axes are treated as valence and arousal, so the model categorizes music into four quadrants: high valence and high arousal (joy, exuberance), high valence and low arousal (contentment), low valence and high arousal (anger), and low valence and low arousal (depression). For a state-of-the-art review of music emotion recognition, see Kim et al. (2010).
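A minimal sketch of this quadrant scheme, assuming a system that outputs continuous valence and arousal estimates scaled to [-1, 1] (an assumption, not something specified above), could assign the four labels as follows:

def thayer_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) estimate in [-1, 1] to one of the four
    quadrant labels described above; the 0.0 split point is an assumed
    threshold, not part of the original model description."""
    if valence >= 0.0 and arousal >= 0.0:
        return "joy / exuberance"   # high valence, high arousal
    if valence >= 0.0:
        return "contentment"        # high valence, low arousal
    if arousal >= 0.0:
        return "anger"              # low valence, high arousal
    return "depression"             # low valence, low arousal

print(thayer_quadrant(0.7, 0.6))    # joy / exuberance
print(thayer_quadrant(-0.5, -0.4))  # depression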
 