modal versus cross-modal effects on perceived
audio quality are summarized.
Finally, a summary is given that reviews the
most important concepts leading to the salience
model presented in the preceding section, and
further research potential is identified.
MECHANISMS OF HUMAN PERCEPTION
Vision (sight) and audition (hearing) are the most
important human senses for playing games. In
the real world, these senses provide us with in-
formation about the more remote surroundings,
as opposed to taste (gustation), smell (olfaction),
and touch (taction or pressure) which provide
information about our immediate vicinity. Because
vision and audition communicate spatial and
temporal relations of objects, and because the
technology needed to stimulate them is readily
available on home computer systems, most games
stimulate only these two senses.
Auditory Perception
Auditory stimuli are perceived to be localized in
space. The sound is not heard within the ear, but
it is phenomenally positioned at the source of the
sound. In order to localize a sound, the auditory
system relies on binaural and monaural acoustic
cues. Directional hearing in the horizontal plane
(azimuth) is dominated by two mechanisms which
exploit binaural time differences and binaural
intensity differences. For sinusoidal signals,
interaural time differences (ITDs, the same
stimulus arriving at different times at the left
and the right ear) can be interpreted by the
human hearing system as directional cues from
around 80 Hz up to a maximum frequency of around
1500 Hz.
This maximum frequency corresponds to a
wavelength of roughly the distance between the
two ears. For higher frequencies, more than one
wavelength fits between the two ears, making the
comparison of phase information between left and
right ear equivocal (Braasch, 2005). For signals
with frequencies above 1500 Hz, interaural level
differences (ILDs) between the two ears are the
primary cues (Blauert, 2001). Regardless of the
source position, ILDs are small at low frequencies.
This is because the dimensions of the head and
pinnae (the outer ear, visible on the side of the
head) are small in comparison to the wavelengths
at frequencies below about 1500 Hz; they therefore
present no significant obstacle to the propagation
of sound.
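The duplex picture above (ITDs usable below roughly 1500 Hz, ILDs taking over above) can be sketched numerically. The speed of sound and the interaural distance below are illustrative assumptions, not values given in the text:

```python
# Minimal sketch of the ITD/ILD frequency cutoff described above.
# Assumed constants (illustrative, not from the text):
SPEED_OF_SOUND = 343.0       # m/s, air at about 20 degrees C
INTERAURAL_DISTANCE = 0.23   # m, rough ear-to-ear path length

def wavelength(frequency_hz: float) -> float:
    """Wavelength of a sinusoid in air, in meters."""
    return SPEED_OF_SOUND / frequency_hz

def itd_is_unambiguous(frequency_hz: float) -> bool:
    # Interaural phase comparison is unequivocal only while less
    # than one wavelength fits between the two ears.
    return wavelength(frequency_hz) > INTERAURAL_DISTANCE

print(itd_is_unambiguous(800.0))   # -> True: low frequency, ITD usable
print(itd_is_unambiguous(3000.0))  # -> False: rely on ILDs instead
```

Note that 343 / 1500 is approximately 0.23 m, consistent with the text's observation that the cutoff wavelength is roughly the distance between the two ears.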
Visual Perception
Vision mainly serves to indicate the spatial
correlation of objects, as the human visual system
seldom responds to direct light stimulation. Rather,
light is reflected by objects and thus transmits
information about certain characteristics of the
object. The direction of a visually perceived object
corresponds directly to the position of its image
on the retina, the place where the light receptors
are located in the eye. At the same time, a visual
stimulus occupies a position in perceptual space
that is defined relative to a distance axis, as well
as to the vertical and horizontal axes.
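The correspondence between an object's direction and its image position on the retina can be sketched with a simple pinhole model of the eye; the focal length used here is an assumed illustrative value, a strong simplification of the eye's actual optics:

```python
# Pinhole-camera sketch of retinal projection (an assumed
# simplification of the eye's optics, for illustration only).
FOCAL_LENGTH = 0.017  # m, rough optical length of the eye (assumed)

def retinal_position(lateral_offset_m: float, distance_m: float) -> float:
    """Image position on the retina for an object offset sideways
    from the line of sight: position encodes direction."""
    return FOCAL_LENGTH * lateral_offset_m / distance_m

def retinal_size(object_height_m: float, distance_m: float) -> float:
    """Image size shrinks with distance -- the geometric basis of
    the 'size' depth cue."""
    return FOCAL_LENGTH * object_height_m / distance_m

# A 1.8 m figure projects a much smaller image at 20 m than at 2 m.
print(retinal_size(1.8, 2.0) > retinal_size(1.8, 20.0))  # -> True
```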
In determining an object's distance from the
eye, a number of depth cues are available. These
include monocular mechanisms like
interposition, size, and linear perspective as well
as binocular cues like convergence and disparity.
All of these are usually evaluated jointly,
allowing us to resolve even ambiguous situations
with contradictory sensory information.

All these depth cues can be exploited even when
the environment is at rest. As soon as motion (of
objects or of the head) is present, motion parallax
takes on an important role in depth perception.
Motion parallax describes the fact that the image
of an object far away from the viewer moves more
slowly across the retina than the image of an
object at a close distance. Motion parallax also
provides cues in the monocular case.
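The motion parallax relation (nearer objects sweep faster across the retina) can be sketched with a small-angle model; the observer speed and distances below are illustrative assumptions:

```python
# Small-angle sketch of motion parallax: for a laterally moving
# observer, the angular (retinal) speed of an object's image
# falls off with the object's distance.
def angular_speed(observer_speed_m_s: float, distance_m: float) -> float:
    # Object assumed to lie perpendicular to the direction of
    # motion; result is in radians per second.
    return observer_speed_m_s / distance_m

near = angular_speed(1.0, 2.0)    # object 2 m away
far = angular_speed(1.0, 20.0)    # object 20 m away
print(near > far)  # -> True: the nearer object's image moves faster
```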