Interactions like the two-handed multitouch example above are the simplest cases of what are being called natural user interfaces (NUIs). These are interfaces that can involve multiple nondeterministically decoded channels of communication, leveraging our different senses (e.g., the ability to point with a finger while giving instructions by voice). Not surprisingly, the decoding of multiple streams of data into a coherent goal can be very challenging. One particular challenge is that in the WIMP interface, each interaction is purposeful and demarcated: We start an action by pressing a start button, for instance, and the meaning is completely clear. But for a camera-based interface that watches a user's face or hands for indication of an action to take, there's no clear delimiting of the action; the system must infer the start and end.
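To make that inference problem concrete, here is a minimal sketch (not from the text) of one common heuristic: infer a gesture's start and end by applying hysteresis to the tracked hand's speed. The HandSample type, the thresholds, and the sample data are all illustrative assumptions.

```cpp
#include <cmath>
#include <iostream>

// Hypothetical 2D hand-tracking sample (position in meters, dt in seconds).
struct HandSample { float x, y, dt; };

// Segment gestures by hysteresis on hand speed: declare a start when speed
// rises above startThreshold, and an end when it falls back below stopThreshold.
class GestureSegmenter {
public:
    GestureSegmenter(float startThreshold, float stopThreshold)
        : start_(startThreshold), stop_(stopThreshold) {}

    // Feed consecutive samples; returns true while a gesture is inferred to be active.
    bool update(const HandSample& prev, const HandSample& cur) {
        float dx = cur.x - prev.x, dy = cur.y - prev.y;
        float speed = std::sqrt(dx * dx + dy * dy) / cur.dt;
        if (!active_ && speed > start_)    active_ = true;   // inferred start
        else if (active_ && speed < stop_) active_ = false;  // inferred end
        return active_;
    }

private:
    float start_, stop_;
    bool active_ = false;
};

int main() {
    GestureSegmenter seg(0.5f, 0.1f);  // thresholds in m/s, chosen arbitrarily
    HandSample samples[] = {{0, 0, 0.033f}, {0.05f, 0, 0.033f}, {0.10f, 0, 0.033f},
                            {0.101f, 0, 0.033f}, {0.101f, 0, 0.033f}};
    for (int i = 1; i < 5; ++i)
        std::cout << "frame " << i << ": gesture "
                  << (seg.update(samples[i - 1], samples[i]) ? "active" : "idle") << "\n";
}
```

Real systems combine many such cues (pose, dwell time, context), but the two-threshold hysteresis keeps the inferred start and end from flickering when the speed hovers near a single cutoff.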
21.2.1 Prescriptions
We conclude these generalities with a few ideas that are important for anyone designing any kind of interface. There are no absolute prescriptions in interaction design except, perhaps, “You should test your design on real users.” Designs must often satisfy the needs of both beginners and power users, and until the design is widely adopted, it's not certain that it will ever have power users. Designs must work within a budget: Interaction may be allocated only a tiny fraction of processor time, pixel fill rate, or other resources. As processor speed, fill rate, bandwidth, and other factors change, the sweet spot for a design can shift substantially.
For every design, some degree of responsiveness and fluidity is essential. When you click a button on a GUI, you need to know that the click was detected by the program: The button should change its appearance, and perhaps you should get audio feedback as well. It's essential that these happen apparently instantly—by the time there's a lag of even 0.2 sec, the interface begins to feel clunky and unreliable. The more “immediate” the GUI feels, the more critical prompt feedback becomes: When we feel separated from the computer, treating it as a device or machine, some delay is tolerable. The more we perceive it as “real,” the more we expect things to behave as they do in the real world, that is, with instant feedback. With modern controllers—you use your hand to select from a menu in many Kinect-based games, for instance—the feeling of reality is substantially enhanced, and real-time feedback is essential. In fact, the separation of an interaction loop (something that receives and processes interrupts from interaction devices, with a high processor priority) into its own high-priority thread of execution is critical to maintaining a sense of hand-to-eye coordination, and a feeling of fluidity in the interface.
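As a rough illustration of that separation (not the book's code), the sketch below runs a lightweight interaction loop on its own thread, acknowledging each event immediately and deferring the slower application work to a second thread. The event type, timings, and console output are illustrative assumptions, and a real system would additionally raise the interaction thread's OS priority, a platform-specific call omitted here.

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

// Hypothetical event type; in a real toolkit this would come from the windowing
// system or device driver rather than being synthesized in a loop.
struct InputEvent { int id; };

std::queue<InputEvent> pending;
std::mutex queueMutex;
std::atomic<bool> running{true};

// Interaction loop: acknowledge each event immediately (the stand-in here for
// highlighting a button or playing a click), then hand the real work off.
void interactionLoop() {
    int nextId = 0;
    while (running) {
        InputEvent e{nextId++};                                     // stand-in for polling a device
        std::cout << "event " << e.id << ": button highlighted\n";  // instant feedback
        {
            std::lock_guard<std::mutex> lock(queueMutex);
            pending.push(e);                                        // defer the slow part
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
}

// Worker loop: performs the slower application logic without ever blocking
// the feedback path above.
void workerLoop() {
    while (running) {
        bool have = false;
        InputEvent e{0};
        {
            std::lock_guard<std::mutex> lock(queueMutex);
            if (!pending.empty()) { e = pending.front(); pending.pop(); have = true; }
        }
        if (have) {
            std::this_thread::sleep_for(std::chrono::milliseconds(300)); // "slow" work
            std::cout << "event " << e.id << ": action completed\n";
        } else {
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
        }
    }
}

int main() {
    std::thread ui(interactionLoop);
    std::thread worker(workerLoop);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    ui.join();
    worker.join();
}
```

The point of the split is that the acknowledgment never waits on the application logic: even when the worker falls behind, the user still sees each click register at once.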
The need for instant feedback and fluidity is context-dependent: A WIMP desktop GUI may need smooth feedback, but a twitch game demands it—players get annoyed when their on-time interactions register too late to be effective! In a virtual reality environment, it becomes critical: Failure to update the interface (which may be the entire scene!) can lead to cybersickness (nausea due to inconsistent apparent motion). Thus, sufficiently rapid feedback becomes almost as severe a constraint as hard-real-time scheduling.
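To show what treating the update rate as a near-hard deadline can look like in practice, here is a small sketch (an illustrative assumption, not a real VR runtime): each frame is measured against a fixed budget, and optional work is shed when the deadline is at risk so that the displayed view never stalls.

```cpp
#include <chrono>
#include <iostream>
#include <thread>

using Clock = std::chrono::steady_clock;

// Hypothetical per-frame budget for a 90 Hz head-mounted display (about 11 ms).
constexpr std::chrono::microseconds kFrameBudget{11111};

// Stub stages standing in for real rendering work.
void drawEssentialScene() { std::this_thread::sleep_for(std::chrono::milliseconds(6)); }
void drawOptionalDetail() { std::this_thread::sleep_for(std::chrono::milliseconds(8)); }

// Render one frame, skipping optional detail when too little of the budget
// remains, so the view keeps updating at the display rate.
void renderFrame() {
    auto frameStart = Clock::now();
    drawEssentialScene();                               // always keep the view current
    if (Clock::now() - frameStart < kFrameBudget / 2)   // enough slack for extras?
        drawOptionalDetail();
    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        Clock::now() - frameStart);
    std::cout << "frame took " << elapsed.count() << " ms\n";
}

int main() {
    for (int i = 0; i < 5; ++i) renderFrame();
}
```

Real systems use subtler strategies, such as level-of-detail reduction, but the principle is the same: missing the deadline is treated as a failure rather than a mere slowdown.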
There are automobiles that seem “right” the moment you sit in them. You can
tell instantly where all the controls are. As you grab the steering wheel, you notice
that there are buttons nearly under your thumbs, in easy reach, but placed so that
you won't trigger them accidentally. When you shift the transmission, the current
gear is displayed clearly but subtly. When a display element changes discretely,