number of options are available soundwise that
can be supplied to the player through a database.
As an example, you can, and probably should,
limit the number of weapons accessible to the
player. Every single weapon should be discernible from any other weapon through its sound in
order to enhance the semantic value. Our point
here is that the player is supplied with a number
of objects with which to play the game. Many of
them produce sound effects within the diegesis of
the game, such as shots. As this example shows,
the IEZA-framework is useful in this part of the
process, helping to discern what kind of sound belongs where in the game's structure. You have some,
but not total, control over when, why and where
the player will use these play objects.
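A minimal sketch of what such a sound database might look like is given below, with each play-object sound tagged by its IEZA domain (Interface, Effect, Zone, Affect). The SoundAsset structure, the weapon names and the file paths are hypothetical illustrations rather than part of the framework or of any particular engine; the gunshots are placed in the Effect domain because, as noted above, they are sound effects produced within the diegesis of the game.

from dataclasses import dataclass
from enum import Enum


class IezaDomain(Enum):
    EFFECT = "effect"        # diegetic, activity-related (e.g. gunshots)
    ZONE = "zone"            # diegetic, setting-related (e.g. ambience)
    INTERFACE = "interface"  # non-diegetic, activity-related (e.g. menu sounds)
    AFFECT = "affect"        # non-diegetic, setting-related (e.g. mood music)


@dataclass
class SoundAsset:
    asset_id: str
    file_path: str
    domain: IezaDomain


# Hypothetical weapon-sound table: keeping the set of weapons small and giving
# each one its own Effect-domain asset keeps them discernible by ear alone.
SOUND_DB = {
    "pistol": SoundAsset("pistol_fire", "sfx/pistol_fire.ogg", IezaDomain.EFFECT),
    "shotgun": SoundAsset("shotgun_fire", "sfx/shotgun_fire.ogg", IezaDomain.EFFECT),
    "rifle": SoundAsset("rifle_fire", "sfx/rifle_fire.ogg", IezaDomain.EFFECT),
}


def sound_for(weapon_id: str) -> SoundAsset:
    """Return the sound asset attached to a play object the player can use."""
    return SOUND_DB[weapon_id]

In a sketch like this, looking up sound_for("shotgun") returns an asset whose domain label tells the later mixing and balancing stage where that sound sits in the IEZA structure.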
In the above, our focus is on the sonic environment of computer games and the problem of
balancing the sounds in relation to each other.
However, very few games consist of sound alone
(notable examples can be found on websites such
as http://www.audiogames.net/). What happens,
then, when a game consisting of sound and graphical elements is played? What does sound provide
to this experience? In the following section we
discuss a case study that relates to this issue in
general, and the use of sound as a means of directing the player in particular.
According to Ong (1982/90), vision and
hearing have a basic bipolarity. The aim of the
following paragraph is to discuss the relation
between vision and hearing, Ong's suggested bipolarity between the two, and how this relates to immersion. Vision separates us from
the environment, making the limits of our bodily
containers protrude, whereas sound integrates
us with the environment, blurring the border between the container of the self and the adjacent
environment. Ong's theory, which is concerned
with the differences between written and spoken
language, might seem odd to use in relation to a
model for the analysis and production of computer
game audio. Nevertheless, we find his remark
about this bipolarity highly relevant with regard
to understanding the function that sound has
in audiovisual constructions such as computer
games. Think about it: in order to fully take in
an environment through vision we need to move
around and turn our eyes towards what we would
like to see (cf. Ong, 1982/90; Gibson, 1986). Ong
actually refers to the immersive effect that high-fidelity audio reproduction accentuates. Sight is limiting in a way that hearing is not. We
can hear what is behind us and then turn around
to see it. If sound integrates rather than separates
us from the surrounding environment, it would
seem reasonable that sound and immersion have
a strong relationship. Integration through sound
might lead to immersion. Hearing is, on the other
hand, also a selective process. We may, to some
extent, filter out uninteresting and disturbing
sounds. A constructed audiovisual environment is a prefiltered one into which sound and images have
been put through the selective processes of their
creators. We therefore discuss hearing, vision, the
visual, and affordances (Gibson, 1977, 1986) in
relation to the sonic environments of computer
games.
To some extent, the bipolarity of hearing and
vision is innate. Biologically, we develop hearing
before we can see. A human fetus can normally
perceive sound from week 15 after conception
and the ears are usually fully developed by week
24. The fetus is surrounded by amniotic fluid and
this underwater environment is an immersive one
that completely encloses us. We are immersed
and can feel touch from week 10. In fact, one
of the primary definitions of immersion used in
the context of computer games clearly connects
the concept of immersion with being under water
(Murray, 1997, pp. 98-99). Hearing the environment precedes seeing it, in terms of how these
senses develop from conception, and feeling the
environment precedes hearing it. In the womb,
movement is restricted and, as newborns, we have
no locomotion and must be transported by others. Sight is still limited and objects in the visual
field need to be very close to be in sharp focus,