the T9 interface can adapt to the user's behavior by means of a custom dictionary
where all exceptions are stored.
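As an illustration of this adaptation mechanism, the following minimal sketch (not taken from the cited work) indexes a built-in word list by T9 key sequences and lets user-confirmed exceptions in a custom dictionary take priority over the built-in suggestions; the word lists, class, and method names are assumptions made for the example.

```python
# Minimal sketch of T9-style predictive lookup with a user exception
# dictionary; word lists and names are illustrative assumptions.
T9_KEYS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
DIGIT_OF = {ch: d for d, letters in T9_KEYS.items() for ch in letters}

def to_key_sequence(word: str) -> str:
    """Map a word to the digit sequence a user would type for it."""
    return "".join(DIGIT_OF[c] for c in word.lower())

class T9Predictor:
    def __init__(self, base_words):
        # Built-in dictionary indexed by key sequence.
        self.base = {}
        for w in base_words:
            self.base.setdefault(to_key_sequence(w), []).append(w)
        # Custom dictionary: user-confirmed exceptions take priority.
        self.custom = {}

    def suggest(self, keys: str):
        # Exceptions learned from the user shadow the built-in entries.
        return self.custom.get(keys, []) + self.base.get(keys, [])

    def learn(self, word: str):
        # Store a word the built-in dictionary did not predict first.
        self.custom.setdefault(to_key_sequence(word), []).insert(0, word)

predictor = T9Predictor(["home", "good", "gone"])
print(predictor.suggest("4663"))   # ['home', 'good', 'gone']
predictor.learn("hood")            # user picked an out-of-dictionary word
print(predictor.suggest("4663"))   # ['hood', 'home', 'good', 'gone']
```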
The principle behind ubiquitous computing and ambient intelligence is to form a
network of computers and interfaces that surround the user, sharing input and output
of individual machines and creating a synergy of interactions [64].
In the HCI context, each channel of communication between human and machine
is called a modality. This concept allows interfaces to be separated into unimodal and
multimodal ones. Multimodal interfaces are becoming an increasingly prominent field
of research [67]. However, multimodal realizations are essentially combinations of
multiple unimodal systems. For the purpose of this overview it is sufficient to describe
in detail only the building blocks, i.e. unimodal systems. More insight into multimodal
systems is given in the context of multiscale interaction in the following sections.
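To make this distinction concrete, the following sketch models each modality as a separate unimodal channel and a multimodal interface as a simple combination of such channels; the class and method names are illustrative assumptions, not an established HCI API.

```python
# A minimal sketch of the idea that a multimodal interface is a
# combination of unimodal channels; all names are illustrative.
from abc import ABC, abstractmethod

class UnimodalChannel(ABC):
    """One human-machine communication channel (modality)."""
    @abstractmethod
    def read(self) -> dict:
        """Return the latest event observed on this channel."""

class SpeechChannel(UnimodalChannel):
    def read(self) -> dict:
        return {"modality": "audio", "event": "utterance"}

class GestureChannel(UnimodalChannel):
    def read(self) -> dict:
        return {"modality": "visual", "event": "gesture"}

class MultimodalInterface:
    """Combines several unimodal channels and collects their events."""
    def __init__(self, channels):
        self.channels = list(channels)

    def poll(self):
        return [channel.read() for channel in self.channels]

ui = MultimodalInterface([SpeechChannel(), GestureChannel()])
print(ui.poll())
```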
Examples of Unimodal Systems in HCI
Prominent examples of visual systems are algorithms that detect human faces in
digitized images [68]. Starting from simple detection, these methods soon evolved
to analyze facial expressions. Improvements in computational power and image
resolution enabled full-body movement tracking with markers [69] and without
markers [70]. This progress made it possible to implement gesture recognition systems [71].
Another interesting technique is gaze detection and eye tracking, which is commonly
used as a form of communication for people with disabilities [72].
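As a concrete example of such a visual unimodal system, the sketch below runs a standard Haar-cascade face detector as shipped with OpenCV; the input file name and detector parameters are assumptions, and the cited works may rely on entirely different methods.

```python
# Face detection with OpenCV's bundled Haar cascade; file names and
# parameters are illustrative assumptions.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("portrait.jpg")            # hypothetical input image
if image is None:
    raise SystemExit("portrait.jpg not found")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Returns (x, y, width, height) rectangles, one per detected face.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("portrait_faces.jpg", image)
```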
Audio systems have been growing in popularity since robust cloud-based voice
recognition systems were introduced [73]. Traditional desktop voice recognition
systems are also starting to offer ways to recognize the speaker, based on a
pre-learned database of voices [74]. Another approach to audio systems is the
extraction of the emotional state of the speaker (laughing, crying, sighing) from the
audio signal in addition to the word content [75].
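A minimal sketch of how such a cloud-based recognizer might be invoked is shown below, using the third-party SpeechRecognition package for Python; the package choice, audio file name, and backend service are assumptions and not the specific systems cited above.

```python
# Cloud-based speech recognition via the SpeechRecognition package;
# the recording name and service choice are illustrative assumptions.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("utterance.wav") as source:   # hypothetical recording
    audio = recognizer.record(source)

try:
    # Sends the audio to a cloud recognizer and returns the transcript.
    text = recognizer.recognize_google(audio)
    print("Recognized:", text)
except sr.UnknownValueError:
    print("Speech was unintelligible")
except sr.RequestError as exc:
    print("Cloud service unavailable:", exc)
```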
Sensor systems are commonly used; the keyboard, mouse, and joystick are the most
prominent examples [62]. Pen-based input grew popular for a time as a highly natural
means of communication, but because the implementation was unreliable and expensive,
the idea of Tablet PC computing never became mainstream. With the renaissance of
tablets, manufactured as light and mobile devices, pen-based input is expected to
become as important as established interaction via keyboard, mouse, and touch [76].
Haptic and pressure interactions are valuable for robotics and medicine. These sensors
allow robots to sense the environment as humans do, enabling telepresence
applications [77]. Biomedical approaches consider the use of sensor systems during
microsurgery: a robotic interface could help translate the surgeon's senses from the
small movement scale of the operation to the familiar macro-surgical scale [78].
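The idea of translating between movement scales can be sketched as a simple scaling (and tremor filtering) of the surgeon's hand displacements before they drive the instrument tip; the scale factor and filter weight below are illustrative assumptions rather than values from the cited work.

```python
# Motion scaling for a robotic microsurgery interface: hand motion is
# smoothed and scaled down before driving the instrument. The constants
# are illustrative assumptions.
SCALE = 0.1          # 10 mm of hand motion -> 1 mm of instrument motion
SMOOTHING = 0.8      # simple exponential filter weight to suppress tremor

def scale_motion(hand_deltas, scale=SCALE, smoothing=SMOOTHING):
    """Map hand displacements (mm) to instrument displacements (mm)."""
    filtered = 0.0
    instrument_deltas = []
    for delta in hand_deltas:
        # Exponential moving average removes high-frequency tremor.
        filtered = smoothing * filtered + (1.0 - smoothing) * delta
        instrument_deltas.append(filtered * scale)
    return instrument_deltas

# A 5 mm hand movement followed by small tremor becomes sub-millimetre
# instrument motion at the micro-surgical scale.
print(scale_motion([5.0, 5.0, 0.2, -0.1]))
```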