they lead us to conclude that implementing a cooperating MMD system is not
straightforward, and that the links between the theories and the concrete
examples are not easy to establish. Some system designers redefine their notion
of cooperation in terms that are closer to technical concerns or to
directly implementable criteria. One of the aspects of cooperation in Luzzati
[LUZ 95, p. 39] falls within the scope of the system's management of its own
errors. Cooperation is expressed mostly through the choice of answers the
system gives to the user's utterances. If we take as an example “how long with
this itinerary?”, the first answer that the system could give would be “two
hours”. In that case, the system, sometimes referred to as a communicating system,
simply answers the question asked, without adding anything to or subtracting anything
from the queried value. A second possible answer could be “two hours due to
a change at Versailles”. In this case, the answer to the question is partly
assessed and partly explained. It is assessed by the system which, based on
criteria such as the average length of a single journey between Palaiseau and
Paris, observes that a two-hour journey is long and might not please the user. It is
then explained by the system, which looks for and describes the main reason
behind the length of time. With such an answer, the system can be considered
cooperative. However, it remains within its role as a system in charge of
satisfying the user's queries. A third possible answer goes beyond this role:
by answering “two hours, due to a change at Versailles, but if you go through
Meudon you'll get there in fifty minutes”, the system shows an increased
level of cooperation, and can then be referred to as a collaborating system,
in that it suggests a change of direction to the user, proposing a different
itinerary from the one originally chosen. This change in direction, if the
user accepts it, leads to the joint construction of a common goal.
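
As a rough illustration of these three levels, the short Python sketch below generates the three kinds of answers described above. It is only a sketch, not an implementation from [LUZ 95]: the Itinerary class, the answer function, the duration values and the 70-minute average used as an assessment threshold are all illustrative assumptions; only the station names echo the example.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Itinerary:
    via: str             # station where the traveller changes trains
    duration_min: int    # total journey time in minutes

def answer(level: str, chosen: Itinerary,
           alternative: Optional[Itinerary] = None,
           average_min: int = 70) -> str:
    """Answer "how long with this itinerary?" at a given level of cooperation."""
    literal = f"{chosen.duration_min} minutes"
    if level == "communicating":
        # Level 1: answer the question literally, nothing more, nothing less.
        return literal
    explained = f"{literal}, due to a change at {chosen.via}"
    if level == "cooperating" or alternative is None:
        # Level 2: assess the duration against an average journey time and,
        # if it seems long, explain the main reason behind it.
        return explained if chosen.duration_min > average_min else literal
    # Level 3 (collaborating): go beyond the question and suggest a better
    # itinerary, inviting the user to revise the goal jointly with the system.
    if alternative.duration_min < chosen.duration_min:
        return (f"{explained}, but if you go through {alternative.via} "
                f"you'll get there in {alternative.duration_min} minutes")
    return explained

# The Palaiseau-Paris example discussed above.
chosen = Itinerary(via="Versailles", duration_min=120)
alternative = Itinerary(via="Meudon", duration_min=50)
for level in ("communicating", "cooperating", "collaborating"):
    print(f"{level}: {answer(level, chosen, alternative)}")

Run on this example, the three calls produce, respectively, the literal answer, the assessed and explained answer, and the answer that suggests the alternative itinerary.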
8.1.2. Speaking turns and interactive aspects
The train ticket reservation task can lead to natural dialogues, as in the
introduction's example, but also to situations which are closer to the caricature
of a certain type of communication between a human being and a machine: “I
would like to go to Paris”, “what day?”, “it will be tomorrow”, “what time?”,
“let's say around nine o'clock”, “what class?”, “first, please”, etc. The user
can produce all kinds of imaginable utterances; the artificial quality here comes
from the system. By generating only questions, always phrased in
the same way, the system does help the task progress, but it contributes nothing
to the linguistic aspects which make a natural dialogue in natural