Figure 11.7. A knowledge base about sore joints (joints.pl)
% What is known about sore joints
background([ [sore_elbow(X), tennis_elbow(X)],
[sore_elbow(X), sore_joints(X)],
[sore_joints(X), arthritis(X)],
[sore_hips(X), sore_joints(X)],
[sore_hips(X), hip_fracture(X)],
[tennis_player(X), tennis_elbow(X)] ]).
% The possible explanations to consider
assumable(tennis_elbow(_)).
assumable(arthritis(_)).
?- background(K), est([[A]|K],[sore_elbow(sue)]).
A = sore_elbow(sue)
;
A = tennis_elbow(sue)
;
A = sore_joints(sue)
;
A = arthritis(sue)
;
No
Thus there are four possible atoms A that could be added as clauses to the background
knowledge K to get the desired conclusion. The first atom, sore_elbow, is the trivial
explanation as before. The third atom, sore_joints, may or may not be useful as
an explanation. The other two, tennis_elbow and arthritis, are likely the desired
explanations for sore_elbow.
Often only explanations drawn from some prespecified assumption set or hypothesis
set are of interest. A typical case of explanation is (medical) diagnosis. In diagnosis,
some of the atoms are considered to be possible symptoms, and some of the atoms
are considered to be possible diseases. Observations that are among the symptoms are
given to a diagnostician, and an explanation is sought among the diseases. In general,
what one is willing to assume depends on the problem, and that is what the
assumable predicate in figure 11.7 is for:
?- background(K), assumable(A),
est([[A]|K], [sore_elbow(sue)]).
A = tennis_elbow(sue)
;
A = arthritis(sue)
;
No
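The behavior of these two queries can be sketched outside Prolog. The following Python fragment is a hypothetical re-implementation, not the book's est predicate: since every atom here concerns the single individual sue, atoms are represented as plain strings, each background clause becomes a head with a body, the deductive closure is computed by forward chaining, and an atom explains an observation exactly when adding it as a fact makes the observation derivable.

```python
# Sketch of explanation over the joints.pl knowledge base (not the
# book's est predicate), specialized to the single individual sue.

# Each clause [Head, Body] from background/1 becomes (head, [body atoms]).
CLAUSES = [
    ("sore_elbow",    ["tennis_elbow"]),
    ("sore_elbow",    ["sore_joints"]),
    ("sore_joints",   ["arthritis"]),
    ("sore_hips",     ["sore_joints"]),
    ("sore_hips",     ["hip_fracture"]),
    ("tennis_player", ["tennis_elbow"]),
]

def closure(facts):
    """All atoms derivable from the given facts by forward chaining."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in CLAUSES:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return derived

def explanations(observations, candidates):
    """Candidate atoms that, added as a fact, derive every observation."""
    return [a for a in candidates if set(observations) <= closure({a})]

# Every atom mentioned in the background knowledge.
ATOMS = sorted({a for head, body in CLAUSES for a in [head, *body]})

# All explanations of a sore elbow (first query in the text):
print(explanations(["sore_elbow"], ATOMS))
# ['arthritis', 'sore_elbow', 'sore_joints', 'tennis_elbow']

# Only the assumable explanations (second query in the text):
ASSUMABLE = ["tennis_elbow", "arthritis"]
print(explanations(["sore_elbow"], ASSUMABLE))
# ['tennis_elbow', 'arthritis']
```

Note that hip_fracture and tennis_player are rejected: neither leads to sore_elbow by the clauses above, which matches the four answers and the "No" returned by the first query.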
Note that the explanations that are generated depend on what needs to be explained.
For example, suppose Sue also has sore hips:
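Following the pattern of the earlier queries, a query of the following form would be expected (this is a sketch, assuming est takes the two observations as a list, as its use with a single-element list above suggests):

?- background(K), assumable(A),
   est([[A]|K], [sore_elbow(sue),sore_hips(sue)]).
A = arthritis(sue)
;
No

Of the two assumables, only arthritis(sue) accounts for both observations, since it leads through sore_joints(sue) to sore_elbow(sue) and sore_hips(sue) alike, whereas tennis_elbow(sue) explains the sore elbow alone.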