we have put these to work in various ways. The
projects are based on and inspired by an ecological
and everyday-listening approach to sound, like
the ones proposed by R. Murray Schafer, William
Gaver, and their followers.
As human beings, we are good at interpreting
the soundscape constantly surrounding us. When
we hear a sound we can make relatively accurate
judgments about the objects involved in generating
the sound, their weight, the materials they are made
of, the type of event or series of events that caused
the sound, the distance and direction to the sound
source, and the environment surrounding the sound
source and the listener, for example. Much of the
existing research on sound and auditory perception
is about how to convey clear and unambiguous
information through sound. In computer games,
however, the aim is also to create other effects,
effects that have as much to do with emotions, the
subconscious, intuition, and immersion as they do
with clear and unambiguous messages.
This article describes a couple of projects in
which we have worked with the balance between
eye and ear, between ambiguity and unambiguity,
between cognition and intuition and between body
and mind. The aim has been to create experiences
built on a multitude of human abilities and af-
fordances, mediated by new media technology.
In a traditional computer game setting, the TV
screen or computer monitor is the center of atten-
tion. The screen depicts the virtual game world and
the player uses some kind of input device, such as
a game pad, a mouse, a keyboard, or a Wiimote, to
remotely control the virtual gameworld and objects
and creatures in it. The action takes place in the
virtual world and the player is naturally detached
from the game action by the gap between the
player's physical world and the virtual world of
the game. Much work has to be done and complex
technology used in order to bridge that gap and
to have the player experience a sense of presence
in the virtual gameworld. The aim is to make the
player feel as immersed as possible in the game
experience and to make her suspend her natural
disbelief. To achieve this, the computer game
industry must build broader and broader bridges
over the reality gap to make the virtual game reality
more immersive. The traditional way to increase
immersion and suspension of disbelief has primarily
been to increase graphics capability, and today
we can enjoy near photo-realistic 3D graphics in
real time. But there might be alternative ways to
tackle the problem. Potentially, computer games
could be more engaging and immersive without
having to build long and broad bridges over the
reality gap. What about narrowing the gap instead
of building broader bridges over it?
BACKGROUND
Sound and light work in different ways and reach
us on complementary channels. Our corresponding
sensory systems, visual and auditory perception,
show both similarities and differences, and we
have an innate ability to experience the world
around us by combining the visual, auditory, touch
and olfactory perceptions into one, multimodal
whole. We are built for and used to handling
the world through a balanced mix of perceptual
input from many senses simultaneously. This
can be exemplified in different ways. One is by
crossmodal illusions, for example, the McGurk
effect (Avanzini, 2008, p. 366) which shows how
our auditory perception is influenced by what we
see. Another example is the ventriloquist illusion
in which the perceived location of a sound shifts
depending on what we see (O'Callaghan, 2009,
section 4.3.1). If the signal on one sensory channel
is weak, we more or less automatically fill in the
gaps with information from other channels and,
in this way, we are able to interpret the sum of
sensory input and make something meaningful
of that sum. Watching lip movements in order to
hear what your friend is saying at a noisy party is
just one everyday example of this phenomenon.
A third example is Stoffregen and Bardy's con-
cept of “global array” (Avanzini, 2008, p. 350).