poses here, we might think about extending the notion of a beeping backpack or a vibrating white cane with an implant into the occipital cortex, or a device worn in a cap, of a blind person. Bone-conducting technologies are already being explored in the context of managing major disasters, as discussed later. One finding in the neuropsychological literature is that a loss of vision does not result in the permanent inactivation of the visual cortex, regardless of the age of onset. Recent neuroimaging studies provide evidence that functional activity in visual brain areas persists in blind individuals performing non-visual cognitive tasks such as Braille reading, listening to words, or sensory discrimination of auditory or tactile stimuli (Burton et al., 2002a, 2002b; Sadato, Okada, Honda, & Yonekura, 2002). Such results suggest that the remaining sensory modalities can be modified or activated through cortical reorganization after severe visual deprivation. Such an approach, because it bypasses audible output, would preserve the privacy of the person and also spare the environment yet another sound source. The challenges in this technology genre thus include technical, psychological, and social issues, but the potential benefits are very exciting. Next, we turn to sound-based technologies.
SOUND-BASED INTERACTIONS

Continuing the thread of using sound to support navigation, studies in sound-based interaction, both with computers and with the physical environment using mobile devices to aid navigation, have been published since the early 1990s (e.g., Flowers & Hauer, 1992; Choudhurry et al., 2004). To date, most of this research has involved sonification of statistical data presented in visual graphs (Brown & Brewster, 2003; Flowers, Buhman, & Turnage, 2005; Rigas & Alty, 2005) or in geographical maps (Zhao et al., 2005; 2008), aiming to render visual data accessible to visually impaired and blind audiences. By relying on differences in pitch to denote slope changes and monotonic differences in the data over time (e.g., Flowers & Hauer, 1995), these studies have generally shown that it is relatively easy to perceive pitch- and intensity-coded auditory data with a minimum of training (Flowers et al., 2005; Zhao et al., 2008); a minimal sketch of such a mapping appears below.

Research involving the sonification of graphs thus enables visually impaired and blind users to perform typical data-inspection tasks such as describing and depicting simple functions, examining the distribution properties of one or more samples, and examining the covariation between two variables. Research into navigation in physical space consists mainly of software installed in portable, mobile devices that can detect obstacles in the immediate surroundings, whether indoors (Simpson et al., 2005; Choudhurry et al., 2004) or outdoors (Trivedi, 2010).
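To make the pitch-coding technique concrete, here is a minimal sketch that renders a data series as a sequence of sine tones, with larger values mapped to higher pitches. It illustrates the general mapping described above, not code from any of the cited systems; the frequency range, tone duration, and output file name are arbitrary choices.

```python
import numpy as np
from scipy.io import wavfile

def sonify(values, fmin=220.0, fmax=880.0, tone_dur=0.25, rate=44100):
    """Map each data value to a sine tone: larger values -> higher pitch."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0                       # avoid division by zero on flat data
    t = np.linspace(0, tone_dur, int(rate * tone_dur), endpoint=False)
    tones = []
    for v in values:
        freq = fmin + (v - lo) / span * (fmax - fmin)   # linear value-to-pitch map
        tones.append(0.5 * np.sin(2 * np.pi * freq * t))
    return np.concatenate(tones)

# Example: an upward trend in the data is heard as a rising pitch contour.
data = [3, 5, 4, 8, 12, 11, 15]
signal = sonify(data)
wavfile.write("graph.wav", 44100, (signal * 32767).astype(np.int16))
```

In practice a logarithmic frequency mapping is often preferred over the linear one sketched here, since pitch perception is roughly logarithmic in frequency.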
More recently, an evolving tool developed in our lab, iGraph-Lite (Ferres et al., 2007), enables users to interact with simple line graphs using natural language. In its current form, iGraph-Lite relies on key commands and a Text-To-Speech (TTS) engine. It comprises three subsystems: (1) a knowledge representation system that enriches a basic semantic representation of line, bar, and combination graphs; (2) a Natural Language Generation (NLG) system that produces a static description of a graph; and (3) an interface that allows users to navigate the full enriched representation of (1) by means of keyboard combinations, much as JAWS or DOLPHIN do. iGraph-Lite can therefore be used to generate rich descriptions to accompany graphs through the longdesc tag in plain HTML and, if a graph is published with iGraph-Lite's full semantics embedded through exif or a similar tool, the iGraph-Lite navigator can be used to explore the graph at different representational levels. The free plugin can be downloaded from http://www.inf.udec.cl/~leo/igraph.html#sec3 to a desktop PC or to a mobile device. Users can navigate a graph using a set of spoken commands, which enable them to interrogate the data in various ways. Currently, the
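As a sketch of how a generated description might accompany a graph through the longdesc mechanism mentioned above, the snippet below writes a description to its own file and builds the corresponding img element. The description text and file names are invented for illustration; only the longdesc attribute itself comes from the discussion above.

```python
from html import escape

# Hypothetical output of an NLG pass over a line graph; the wording and
# file names are illustrative only.
description = (
    "Line graph of monthly rainfall in 2010: values rise steadily from "
    "20 mm in January to a peak of 110 mm in July, then fall to 35 mm "
    "in December."
)

# Write the full description to its own page...
with open("rainfall-desc.html", "w", encoding="utf-8") as f:
    f.write(f"<p>{escape(description)}</p>\n")

# ...and point the graph's longdesc attribute at it, so screen readers
# such as JAWS can follow the link to the rich description.
img_tag = (
    '<img src="rainfall.png" alt="Monthly rainfall, 2010" '
    'longdesc="rainfall-desc.html">'
)
print(img_tag)
```

Because browser and screen-reader support for longdesc is uneven, a visible link to the same description placed next to the graph is a common fallback.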