Figure 8.3 (a, b)
A stick-type input device. (From Nakaie, T., Koyama, T., and Hirakawa, M., Development of a Collaborative System with a Shared Sound Display, 14-19, 2008. With permission.)
8.3.2 Processing Flow
Figure 8.4 shows the processing flow of the system. We adopted Max/MSP
with Jitter as the tool for implementing the software modules.
The system receives inputs from a pair of cameras and an accelerometer
embedded in the stick-type input device (i.e., a Wii Remote). The 3-D position
and motion tracking data are analyzed to recognize a user's gesture. We define
five primitive gestures, as explained in the section below. Here, the
task of identifying gestures is separated from that of their semantic
interpretation in a target application, so that development of the application
is made easier. The result of interpreting a gesture is expressed as a message
and sent to the audio and visual output processing parts. We adopt the Open
Sound Control (OSC) protocol [21] for message communication among the
components.
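As a minimal sketch of the message communication described above, the following encodes an OSC message carrying gesture coordinates. The address pattern `/gesture/tap` is a hypothetical name chosen for illustration; the encoding itself (NUL-padded strings, a type-tag string beginning with a comma, big-endian float32 arguments) follows the OSC 1.0 specification, not the authors' actual implementation.

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are NUL-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (",f" type tags)."""
    type_tags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", v) for v in args)  # big-endian per spec
    return osc_string(address) + osc_string(type_tags) + payload

# Hypothetical tap message: 2-D coordinates on the tabletop.
msg = osc_message("/gesture/tap", 0.42, 0.87)
```

Such a packet could then be sent to the audio and visual components over UDP (e.g., `socket.sendto`), which is how OSC messages are commonly transported.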
On receiving a message, the system executes polyphony control so that
multiple notes can sound at one time. Sound is then generated and emitted
through 16 speakers. Meanwhile, graphical objects are generated in OpenGL
and projected onto the surface of the Sound Table.
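The chapter does not detail the polyphony-control policy, so the following is only a sketch under one common assumption: cap the number of simultaneous voices and, when the cap is reached, "steal" the oldest sounding voice to make room for the new note.

```python
from collections import deque

class PolyphonyController:
    """Limit simultaneous notes; steal the oldest voice when the limit is hit."""

    def __init__(self, max_voices: int = 16):
        self.max_voices = max_voices
        self.active = deque()  # note identifiers, in onset order

    def note_on(self, note_id):
        """Register a new note; return the stolen note's id, or None."""
        stolen = None
        if len(self.active) >= self.max_voices:
            stolen = self.active.popleft()  # oldest voice is silenced first
        self.active.append(note_id)
        return stolen  # the caller stops this voice before starting the new one

    def note_off(self, note_id):
        """Release a note; ignore it if the voice was already stolen."""
        try:
            self.active.remove(note_id)
        except ValueError:
            pass
```

Oldest-note stealing is only one of several standard policies (quietest-note or lowest-priority stealing are alternatives); it is used here because it needs no per-voice amplitude information.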
8.3.3 Primitive Gestures
We define five gestures that are simple, natural, and powerful enough to be
used in a variety of applications. They are tap, sting and release, attack,
flick, and tilt, as explained below (see Figure 8.5).
Tap is a gesture of touching the head of the stick device down onto the
tabletop. It specifies a 2-D position on the table and corresponds to a mouse
click operation. Internally, when a tap gesture is recognized, a message with
the tap's coordinate values is generated.
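The tap recognition step described above can be sketched as follows. This is not the authors' algorithm; it assumes, for illustration, that the tracker reports the stick head's 3-D position each frame and that a tap fires once when the head's height first drops below a small threshold above the tabletop.

```python
class TapDetector:
    """Emit a tap event on the frame the stick head touches the tabletop."""

    def __init__(self, threshold: float = 0.01):  # hypothetical height threshold
        self.threshold = threshold
        self.down = False  # whether the head is currently on the table

    def update(self, x: float, y: float, z: float):
        """Feed one tracked 3-D position; return (x, y) on touch-down, else None."""
        if z <= self.threshold and not self.down:
            self.down = True
            return (x, y)  # tap recognized: the message carries these coordinates
        if z > self.threshold:
            self.down = False  # head lifted; a new tap becomes possible
        return None
```

Latching on the `down` flag ensures a single tap message per touch, mirroring how a mouse click fires once rather than continuously while the button is held.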