of robot behaviour, we will come to accept that robots have passed the
being-in-love version of the Turing Test. Once the simulation of being
in love is as convincing as what we call “the real thing”, the simulation
will appear as reality and we will believe a robot that says “I love you”.
How will this simulated love be programmed? There has been little
research thus far on the problem of enabling robots to love. One
academic whose work has touched on this subject is Aaron Sloman at
the University of Birmingham. Sloman has categorised certain states of
mind as tertiary emotions. Included in this group are anger, longing, grief,
guilt, jealousy, excited anticipation and infatuation. As an example of the
conditions that must exist to give rise to one of these tertiary emotions,
Sloman discusses the question “Why can't a goldfish long for its mother?”
Sloman explains:
Longing for one's mother involves at least: (i) knowing one has a
mother; (ii) knowing she is not present; (iii) understanding the
possibility of being with her; and (iv) finding her absence unpleasant.
These all involve possessing and manipulating information, for
example information about motherhood, about one's own mother,
about locations and change of location, and about the desirability
of being close to one's mother. [12]
But as Sloman points out, these conditions are not sufficient to bring
about the emotion of longing:
If someone in Timbuctu whose mother is in Montreal satisfies all
four of these conditions but hardly ever thinks about his mother
and simply gets on with his job, enjoying his social life and always
sleeps soundly, then that is not a case of longing. He may regret
her absence (which is an attitude), but he does not long for her (an
emotion). Longing for someone requires something more, namely:
(v) not easily being able to put thoughts of that someone out of
one's mind. (Although this is not necessarily so in a case of mild
longing!) This is not just a matter of definition: it is a fact that some
human mental states involve partial loss of control of attention.
You cannot lose what you have never had. So a requirement for
being in such a state is having the ability sometimes to control
what one is thinking of and also being able sometimes to lose that
control. [12]
Sloman emphasises that this requirement in a robot assumes that some
part of the robot's “information mechanism” (i.e., its programming) can
sometimes lose control of what the robot is attending to.
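
To make Sloman's analysis concrete, here is a minimal sketch, in Python, of how conditions (i) to (v) might be encoded as checks over a robot's information states. It assumes a toy belief-and-attention model; every name, field and threshold below is a hypothetical illustration for this passage, not Sloman's own formalism or an actual implementation.

    from dataclasses import dataclass

    @dataclass
    class AgentState:
        """Toy model of the information states Sloman's conditions refer to.
        All field names are hypothetical illustrations."""
        knows_has_mother: bool          # (i)   knows one has a mother
        knows_mother_absent: bool       # (ii)  knows she is not present
        can_conceive_reunion: bool      # (iii) understands the possibility of being with her
        finds_absence_unpleasant: bool  # (iv)  finds her absence unpleasant
        intrusive_thought_rate: float   # (v)   fraction of attention cycles captured
                                        #       by thoughts of the mother (0.0 to 1.0)

    def regrets_absence(s: AgentState) -> bool:
        """Conditions (i)-(iv) alone: an *attitude*, in Sloman's terms."""
        return (s.knows_has_mother and s.knows_mother_absent
                and s.can_conceive_reunion and s.finds_absence_unpleasant)

    def longs_for_mother(s: AgentState, threshold: float = 0.3) -> bool:
        """Longing = the attitude plus (v), partial loss of control of
        attention. The threshold value is an arbitrary assumption."""
        return regrets_absence(s) and s.intrusive_thought_rate >= threshold

    # The man in Timbuctu: satisfies (i)-(iv) but rarely thinks of his mother,
    # so he regrets her absence without longing for her.
    timbuctu = AgentState(True, True, True, True, intrusive_thought_rate=0.01)
    assert regrets_absence(timbuctu) and not longs_for_mother(timbuctu)

The point of separating the two functions is Sloman's own distinction: conditions (i) to (iv) are purely a matter of possessing information, whereas (v) is a dynamic property of how the system allocates attention over time, which a one-off threshold check can only crudely approximate.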