CHAPTER 16
Schizophrenia and narrative in artificial agents
Phoebe Sengers
Cornell University, Ithaca, NY
The premise of this work is that there is something deeply missing from AI, or, more specifically, from the currently dominant ways of building artificial agents. This uncomfortable intuition has been with me for a long time, perhaps from my start as an AI researcher, although for most of that time I was not able to articulate it clearly. Artificial agents seem to be lacking a primeval awareness, a coherence of action over time, something one might, for lack of a better metaphor, term 'soul.'
Roboticist Rodney Brooks expresses this worry eloquently:

Perhaps it is the case that all the approaches to building intelligent systems are just completely off-base, and are doomed to fail.... [C]ertainly it is the case that all biological systems.... [b]ehave in a way which just simply seems life-like in a way that our robots never do.

Perhaps we have all missed some organizing principle of biological systems, or some general truth about them. Perhaps there is a way of looking at biological systems which will illuminate an inherent necessity in some aspect of the interactions of their parts that is completely missing from our artificial systems.... [P]erhaps at this point we simply do not get it, and... there is some fundamental change necessary in our thinking... [P]erhaps we are currently missing the juice of life.

(Brooks 1997: 299-300)
Here, I argue that the 'juice' we are missing is narrative. The divide-and-conquer methodologies currently used to design artificial agents result in fragmented, depersonalized behavior, which mimics the fragmentation and depersonalization of schizophrenia in institutional psychiatry. Anti-psychiatry and narrative psychology suggest that the fundamental problem for both schizophrenic patients and agents is that observers have difficulty understand-