a system based on speakers transmitting complex auditory-spatial signals to analyze the
pattern recognition performance of sighted users; moreover, Golledge and colleagues
(1991) and Shinn-Cunningham and colleagues (1996) designed systems that simulate
realistic sound sources from different locations by using loudspeakers.
The application of sonification techniques to the design process seems to be useful in
creating mobility aids, i.e., electronic travel aids (ETAs) able to “detect the environment
within a certain range or distance, process reflected information, and furnish the user
with certain information in an intelligible and useful manner” (Farmer and Smith 1998
p. 238). Sonar-based techniques are the most widely used mobility aids for blind users; they allow users to perceive spatial information about the environment by means of a source that transduces an ultrasound signal into auditory or haptic feedback (Kay 1964); a sketch of the underlying time-of-flight principle follows the list below. As defined by Farmer and Smith (1998), it is possible to distinguish four categories of ETAs:
1. Devices with a single output for object preview, for example, devices emitting
audiotactile feedback indicating the obstacles encountered in the user's path, e.g., the Mowat Sensor (Morrissette et al. 1981) or the Sonicguide (Kay 1974);
2. Devices with a multiple output for object preview, for example, the Laser Cane
proposed by Benjamin (1973, 1974), a walking cane transmitting and receiving spatial signals to help blind people explore and move within an urban environment;
3. Devices providing both object preview and environmental information, for example, Kay's Advanced Spatial Perception Aid Technology, KASPA (2000), an ultrasonic device (requiring approximately a month of training) designed to allow users to avoid obstacles while moving in the surrounding environment (Kay 2001);
4. Devices using artificial intelligence as a component, for example, the Sonic
Pathfinder, a sonification tool designed by Heyes (1984) to help blind people avoid obstacles by translating the objects detected in front of the user into musical notes conveyed through five input/output loudspeaker devices.
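As a rough illustration of the sonar principle common to these aids, the following Python sketch converts an echo's round-trip delay into a distance and maps that distance onto an audible pitch. It is a minimal sketch under simplifying assumptions (still air, a single echo, a linear distance-to-pitch mapping); the function names and parameter values are hypothetical and do not describe any specific device listed above.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees Celsius

def echo_delay_to_distance(delay_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasound pulse."""
    # Halve the path length: the pulse travels to the obstacle and back.
    return SPEED_OF_SOUND * delay_s / 2.0

def distance_to_pitch(distance_m: float, max_range_m: float = 4.0,
                      f_near: float = 2000.0, f_far: float = 200.0) -> float:
    """Map distance to an audio frequency so that nearer obstacles sound
    higher; this mapping direction is a common but not universal convention."""
    d = min(max(distance_m, 0.0), max_range_m)
    # Linear interpolation between the near and far frequencies.
    return f_near + (f_far - f_near) * (d / max_range_m)

# Example: an echo returning after 11.7 ms corresponds to roughly 2 m.
distance = echo_delay_to_distance(0.0117)
print(f"distance: {distance:.2f} m, tone: {distance_to_pitch(distance):.0f} Hz")
```

A real device must also cope with multiple echoes, beam width, and ambient noise; none of that is modeled here.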
Although all the mentioned studies focus on sensory substitution for blind people, a critical issue emerges: none of the proposed systems and models has been evaluated by assessing the accessibility and usability of the corresponding sonification devices, revealing the lack of an effective user-centered approach in the design process. This issue can be explained by noting that most of the above-mentioned studies conducted the design process from a purely objective perspective: users were involved only after the prototype had been developed, thereby excluding the subjective perspective, which is fundamental to analyzing the components of the interaction between the user and the sonification interface.
One of the first studies that attempted to build a sonification system by following a
user-centered approach was proposed in the 1990s by Meijer (1992), who carried out an
experimental analysis of the system in an everyday-life context. In his work, Meijer introduced the vOICe system, software created to "allow blind people literally to see through sounds" via a continuous horizontal scan of the real-life environment: a head-mounted camera records the surrounding scene, which is then analyzed and translated into a sine-wave acoustic signal. The vOICe system has recently been analyzed using fMRI to detect neural activation in sighted and blind subjects during object recognition tasks (Amedi et al. 2007); the authors found that spatial navigation by means of vOICe is related to the activation of the lateral-occipital tactile-visual area (LOtv).
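To make the vOICe-style mapping concrete, the following Python sketch shows one plausible implementation of a column-by-column image-to-sound scan, with vertical position mapped to frequency and pixel brightness to loudness. The function name sonify_image and all parameter values (scan time, frequency range, sample rate) are illustrative assumptions, not taken from Meijer's actual implementation.

```python
import numpy as np

def sonify_image(image: np.ndarray,
                 scan_time_s: float = 1.0,
                 sample_rate: int = 22050,
                 f_low: float = 500.0,
                 f_high: float = 5000.0) -> np.ndarray:
    """Convert a grayscale image (rows x cols, brightness in [0, 1]) into a
    mono waveform by scanning its columns left to right: each row gets its
    own sine frequency (top rows high, bottom rows low), and a pixel's
    brightness sets the amplitude of that sine during its column's time slot."""
    rows, cols = image.shape
    samples_per_col = int(scan_time_s * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    # Frequencies spaced on a log scale, top of the image mapped to f_high.
    freqs = np.geomspace(f_high, f_low, rows)
    chunks = []
    for c in range(cols):
        brightness = image[:, c]  # one amplitude weight per row
        # Sum of sines, one per row, weighted by pixel brightness.
        chunk = (brightness[:, None]
                 * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        chunks.append(chunk)
    wave = np.concatenate(chunks)
    peak = np.abs(wave).max()
    return wave / peak if peak > 0 else wave  # normalize to [-1, 1]

# Example: a bright diagonal line (top-left to bottom-right) produces a
# one-second falling sweep as the scan moves across the image.
audio = sonify_image(np.eye(64))
```

Log-spaced frequencies roughly match the ear's pitch perception; Meijer's actual system involves many further engineering choices not modeled in this sketch.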