The lesson for us is that, while we may do everything possible to assure that
thought is properly inferred from representations, we can never prevent projective
construction. Interactive designs may actually wish to evoke projective construc-
tion in certain cases so that interactors can experience deeper, more personalized
connections. By planning where and when we wish such constructions to occur,
we may diminish the likelihood of their derailing the whole experience. Designing
moments that invite projective construction may allow interactors to feel a differ-
ence between such moments and others when correct inference is of greater sig-
nificance to the whole action.
Plays, like human-computer interactions, are closed universes in the
sense that they delimit the set of potential actions. As we will see in the dis-
cussion of action ahead, it is key to the success of a dramatic representation
that all of the materials that are formulated into action are drawn from the
circumscribed potential of the particular dramatic world. Whenever this
principle is violated, the organic unity of the work is diminished, and the
scheme of probability that holds the work together is disrupted.
This principle can be demonstrated to apply to the realm of human-
computer interaction as well. One example is the case in which the computer
(a computer-based agent) introduces new materials at the level of thought—
“out of the blue.” Suppose a text messaging system is programmed to be
constantly checking for spelling errors and to automatically correct them
as soon as they are identified. Yes, you know this one—you want to type
“hell” and the program changes it to “he'll,” unless you know that you
can disregard the program's respectful correction by taking the additional
action of deleting its suggestion before the word is completed. If the poten-
tial for this behavior is not represented adequately, it is disruptive when it
occurs, and it will probably cause the person to make seriously erroneous
inferences—e.g., “something is wrong with my fingers, my keyboard, or
my software.” The program “knows” why it did what it did (“thought” exists)
but the person doesn't; correct inferences cannot be made. 13
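The asymmetry described above—the program holds a “thought” the person cannot infer—can be made concrete with a toy sketch. The following is purely illustrative; the dictionary, function, and parameter names are hypothetical and do not correspond to any real messaging system's implementation.

```python
# A toy autocorrect whose "thought" (the correction table) is hidden
# from the person typing. All names here are illustrative assumptions.

CORRECTIONS = {"hell": "he'll"}  # the program's unrepresented "thought"

def complete_word(word, suggestion_dismissed=False):
    """Return the word as it will appear once completed.

    The correction fires automatically unless the person takes the
    additional action of dismissing the suggestion beforehand.
    """
    if not suggestion_dismissed and word in CORRECTIONS:
        return CORRECTIONS[word]
    return word

# The person types "hell"; without warning, something else appears:
print(complete_word("hell"))                            # he'll
# Only by knowing to dismiss the suggestion is the input preserved:
print(complete_word("hell", suggestion_dismissed=True))  # hell
```

The disruption the text describes corresponds to the gap between the two calls: nothing in the interface represents the `CORRECTIONS` table, so the substitution arrives “out of the blue” and invites erroneous inferences about keyboard or software failure.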
13. In human factors discourse, this type of failure is attributed to a failure to establish the
correct conceptual model of a given system (see Rubinstein and Hersh 1984, Chapter 5). The
dramatic perspective differs slightly from this view by suggesting that proper treatment of
the element of thought can provide a good “conceptual model” for the entire medium. It also
avoids the potential misuse of conceptual models as personal constructs that “explain” what is
“behind” the representation; i.e., how the computer or program actually “works.”