occurs, the planner allows the emotion process module to update its state
of mind. This updating process is important as it allows the robot's state
of mind to change because of the planner's actions, which can in turn
affect the planner's behaviour choice, thereby providing a dynamic system
with immediate emotional feedback. Furthermore, while the planner is
carrying out an action it may receive a message from another robot or a
human, which could affect its goals.
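The loop just described is easy to sketch. The following Python fragment is purely illustrative, with class and method names that are assumptions rather than the actual Oz code: the planner acts, the emotion module appraises the outcome and updates the state of mind, that state feeds back into the next behaviour choice, and incoming messages can revise the goal set.

    class EmotionModule:
        """Holds the robot's state of mind and updates it after events."""

        def __init__(self):
            self.state_of_mind = {"joy": 0.0, "distress": 0.0}

        def update(self, pleasantness):
            # Pleasing outcomes raise joy; displeasing ones raise distress.
            if pleasantness >= 0:
                self.state_of_mind["joy"] += pleasantness
            else:
                self.state_of_mind["distress"] -= pleasantness

    class Planner:
        """Chooses a behaviour, acts, and feeds the outcome back."""

        def __init__(self, emotions):
            self.emotions = emotions
            self.goals = {"eat"}

        def step(self, messages):
            # A message from another robot or a human may change the goals.
            for msg in messages:
                kind, _, goal = msg.partition(":")
                if kind == "add":
                    self.goals.add(goal)
                elif kind == "drop":
                    self.goals.discard(goal)
            # Behaviour choice depends on the current state of mind ...
            action, pleasantness = self.choose_behaviour()
            # ... and carrying out the action immediately updates it.
            self.emotions.update(pleasantness)
            return action

        def choose_behaviour(self):
            # Toy policy: a distressed robot that wants to eat seeks food.
            mind = self.emotions.state_of_mind
            if "eat" in self.goals and mind["distress"] > mind["joy"]:
                return "seek food", 0.5
            return "wander", -0.1

Each call to step closes the loop once: the chosen action's outcome changes the state of mind, and the changed state of mind influences the next choice.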
How Emotions Are Affected by Events
A robot can be pleased or displeased by events that happen to it,
including its own actions. How it feels about an event depends on the robot's
goals, which can be anything that the robot wants, such as “I want to eat”
(an example of what was originally specified in Oz to be an active goal—
one whose outcome the robot can influence), or “I want the Mets to win
the World Series” (a passive goal—one that the robot cannot influence).
Later in Oz's development, no distinction was made between active and
passive goals. Instead, goals of both kinds were handled by the robot's
motivation system.
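For illustration, one way to represent these two kinds of goal might look like the Python below; the field names are assumptions, not Oz's actual representation. Under the later scheme the active flag would simply be dropped and every goal handed to the motivation system alike.

    from dataclasses import dataclass

    @dataclass
    class Goal:
        description: str
        importance: float   # how much the robot cares about the outcome
        active: bool        # True if the robot can influence the outcome

    goals = [
        Goal("I want to eat", importance=0.9, active=True),
        Goal("I want the Mets to win the World Series",
             importance=0.2, active=False),
    ]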
When there is a “to eat” goal, the event of eating dinner will be judged
by the robot as pleasing. Events that have already taken place give
rise to joy and distress emotions with an intensity based on a number of
factors, including how pleasant or unpleasant the event was found to be.
Originally in Oz, the prospect of future events gave rise to the emotions
fear and hope, also with intensities determined by how pleasant
or unpleasant a potential future event was expected to be. But later this
approach was replaced by a set of inference rules that were specifically
tailored to achieve the robot's goals. Similarly, some of Oz's emotions
(anger, gratitude, gratification and remorse) were also later modelled by
inference rules that were attached to important goals, rules that assigned
credit or blame when a goal succeeded or failed (or appeared likely to do
so), so that the corresponding emotion would be generated. For example,
if the robot had an important goal that failed, and if the robot inferred
that the failure was due to some action by Fred, then the emotion module
would generate a feeling of anger towards Fred.
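A rule of this general shape is sketched below. The Python is a guess at the form such a rule might take, not the actual Oz inference rules: it appraises the outcome of a goal, scales the resulting joy or distress by the goal's importance, and assigns credit or blame to whichever agent the robot infers to be responsible.

    def appraise(importance, succeeded, responsible_agent=None):
        """Return (emotion, intensity, target) triples for a goal outcome.

        responsible_agent names who the robot credits or blames for the
        outcome: "self", another agent, or None when no one is inferred.
        """
        felt = []
        if succeeded:
            felt.append(("joy", importance, None))
            if responsible_agent == "self":
                felt.append(("gratification", importance, None))
            elif responsible_agent is not None:
                felt.append(("gratitude", importance, responsible_agent))
        else:
            felt.append(("distress", importance, None))
            if responsible_agent == "self":
                felt.append(("remorse", importance, None))
            elif responsible_agent is not None:
                felt.append(("anger", importance, responsible_agent))
        return felt

    # The example from the text: an important goal fails, and the robot
    # infers that the failure was due to some action by Fred.
    print(appraise(importance=1.0, succeeded=False,
                   responsible_agent="Fred"))
    # [('distress', 1.0, None), ('anger', 1.0, 'Fred')]

Rules for outcomes that merely appear likely to occur would fire in the same way, but on an estimated likelihood of success or failure rather than on the actual outcome.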
Clark Elliott and Greg Siegle have explored the way in which the
intensity of an emotion can change as a result of changes in a robot's
situation. This is an important aspect of the simulation of emotion. They
defined three different measures of emotional intensity. One