To support such rule selection, much effort has gone into the use of objective rule evaluation indices such as recall, precision, and other interestingness measures (hereafter, we refer to these indices as "objective indices"). Further, it is difficult to estimate the criterion of a human expert using a single objective rule evaluation index, because his/her subjective criterion, such as the interestingness or importance of a rule for his/her purpose, is influenced by the amount of his/her knowledge and/or the passage of time. In addition, existing rule selection methods have never explicitly re-used the history of each rule evaluation, such as the items an expert focused on and the relationships between those items, which are stored only in the mind of the human expert.
With regard to the above-mentioned issues, we have developed an adaptive rule evaluation support method for human experts that uses rule evaluation models. This method predicts an expert's criteria based on objective indices by re-using the results of past evaluations made by human experts. In Section 6.3, we describe the rule evaluation model construction method based on objective indices. Then, in Section 6.4, we present a performance comparison of learning algorithms used to obtain rule evaluation models. Based on the results of these comparisons, we discuss the applicability of the constructive meta-learning scheme as a learning algorithm selection method for our rule evaluation model construction approach.
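As a concrete illustration of how such a rule evaluation model might be constructed (a minimal sketch under our own assumptions, not the authors' implementation), the following Python snippet assumes scikit-learn and hypothetical objective index values: each mined rule is represented by a vector of index values, and a classifier is trained to reproduce the labels a human expert gave to previously evaluated rules.

```python
# Minimal sketch of rule evaluation model construction (assumed setup, not the
# authors' original code): each row describes one mined rule by its objective
# index values; the target is the label a human expert gave that rule earlier.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: objective index values per rule
# (columns: support, precision/confidence, recall, lift).
X_train = [
    [0.12, 0.85, 0.40, 1.8],
    [0.03, 0.30, 0.10, 0.9],
    [0.25, 0.60, 0.55, 1.2],
]
# The expert's past evaluations of the same rules.
y_train = ["interesting", "not-interesting", "interesting"]

# Any supervised learner can serve as the rule evaluation model;
# a decision tree is used here only for illustration.
model = DecisionTreeClassifier().fit(X_train, y_train)

# Predict the expert's likely evaluation of a newly mined rule.
X_new = [[0.08, 0.75, 0.35, 1.5]]
print(model.predict(X_new))
```

Any supervised learning algorithm could take the place of the decision tree here; which algorithm performs best is exactly the question addressed by the comparison in Section 6.4.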
6.2 Interestingness Measures and Related Work
Much effort has been expended on developing methods to select valuable rules from large mined rule sets based on objective rule evaluation indices. Some of these works propose indices for discovering interesting rules among such a large number of rules.
To avoid confusion among real human interest, objective indices, and subjective indices, we clearly define these terms as follows. Objective Index: a feature of a rule, such as its correctness, uniqueness, or strength, calculated by mathematical analysis; it does not include any human evaluation criteria. Subjective Index: the similarity or difference between the interestingness information given beforehand by a human expert and that obtained from a rule; although it includes some human criteria in its initial state, the similarity or difference is mainly calculated by mathematical analysis. Real Human Interest: the interest felt by a human expert toward a rule in his/her mind.
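To make the distinction concrete, the following sketch shows one possible instantiation of a subjective index (the Jaccard-based measure is our illustrative assumption, not a formulation prescribed in the text): the similarity between the items an expert marked as interesting beforehand and the items appearing in a mined rule.

```python
# Illustrative subjective index (assumed formulation): Jaccard similarity
# between items the expert declared interesting beforehand and the items
# that appear in a mined rule. Unlike an objective index, its value depends
# on expert-supplied information, not on the data alone.
def subjective_index(rule_items: set, expert_items: set) -> float:
    if not rule_items and not expert_items:
        return 0.0
    overlap = rule_items & expert_items
    union = rule_items | expert_items
    return len(overlap) / len(union)

# The expert states beforehand which attributes interest him/her.
expert_focus = {"blood_pressure", "age"}
# Items appearing in a mined rule "age & smoking -> blood_pressure".
rule = {"age", "smoking", "blood_pressure"}
print(subjective_index(rule, expert_focus))  # 0.666...
```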
Focusing on interesting rule selection with objective indices, researchers have developed more than forty objective indices based on the number of instances, probability, statistical values, the quantity of information, the distance between rules or their attributes, and the complexity of a rule [1, 2, 3]. Most of these indices are used to remove meaningless rules rather than to discover rules that are interesting to a human expert, because they cannot incorporate domain knowledge.
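As an illustration (our own sketch using standard textbook definitions rather than formulas from any particular reference), several such indices can be computed purely from a rule's contingency counts, with no reference to the expert's domain knowledge:

```python
# A few standard objective indices computed from the contingency counts of a
# rule A -> C over n instances (sketch with standard textbook definitions):
#   n_ac : instances covering both antecedent A and consequent C
#   n_a  : instances covering A
#   n_c  : instances covering C
def objective_indices(n_ac: int, n_a: int, n_c: int, n: int) -> dict:
    support = n_ac / n
    precision = n_ac / n_a          # also called confidence
    recall = n_ac / n_c             # coverage of the consequent class
    lift = precision / (n_c / n)    # ratio to the consequent's base rate
    return {"support": support, "precision": precision,
            "recall": recall, "lift": lift}

# Example: 40 of the 50 instances matching A also match C, out of 200 total,
# where C holds in 80 instances overall.
print(objective_indices(n_ac=40, n_a=50, n_c=80, n=200))
```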
Ohsaki et al. [4] investigated the relation between objective indices and real human interest by examining actual data mining results and their evaluations by human experts. In this work, a comparison showed that it was difficult to predict an expert's evaluations with any single objective index.