K Nearest Neighbor Edition to Guide Classification Tree Learning: Motivation and Experimental Results
J.M. Martínez-Otzeta, B. Sierra, E. Lazkano, and A. Astigarraga
Department of Computer Science and Artificial Intelligence,
University of the Basque Country, P. Manuel Lardizabal 1,
20018 Donostia-San Sebastian, Basque Country, Spain
ccbmaotj@si.ehu.es
http://www.sc.ehu.es/ccwrobot
Abstract. This paper presents a new hybrid classifier that combines the
Nearest Neighbor distance-based algorithm with the Classification Tree
paradigm. The Nearest Neighbor algorithm is used as a preprocessing step
to obtain a modified training database for the subsequent induction of the
classification tree. The experimental section reports the results obtained
by the new algorithm; compared with those obtained by classification trees
induced from the original training data, the new approach performs better
than or equal to the baseline according to the Wilcoxon signed rank
statistical test.
Keywords: Machine Learning, Supervised Classification, Classifier
Combination, Classification Trees.
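The editing step described in the abstract can be sketched as follows. This is a minimal, illustrative implementation, not the paper's exact procedure: it assumes a leave-one-out k-NN filter (in the spirit of Wilson's edited nearest neighbor rule) that removes training instances whose class disagrees with the k-NN prediction from the remaining data; the edited set would then be handed to a standard tree learner. All function names and the toy dataset are hypothetical.

```python
from collections import Counter
import math

def knn_predict(train, point, k=3):
    """Majority vote among the k nearest training instances."""
    nearest = sorted(train, key=lambda ex: math.dist(ex[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

def edit_training_set(data, k=3):
    """Leave-one-out editing: keep an instance only if the k-NN rule,
    applied to the remaining instances, agrees with its class label."""
    edited = []
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        if knn_predict(rest, x, k) == y:
            edited.append((x, y))
    return edited

# Toy data: two well-separated classes plus one mislabeled instance.
data = [((0.0, 0.0), 0), ((0.1, 0.0), 0), ((0.0, 0.1), 0), ((0.2, 0.1), 0),
        ((5.0, 5.0), 1), ((5.1, 5.0), 1), ((5.0, 5.1), 1), ((4.9, 5.0), 1),
        ((0.1, 0.1), 1)]  # noisy: class-1 label deep in the class-0 region

edited = edit_training_set(data, k=3)
print(len(edited))  # the noisy instance is filtered out, 8 remain
```

A classification tree induced from `edited` rather than `data` would not see the noisy instance, which is the intuition behind using the Nearest Neighbor algorithm as a preprocessing step.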
1 Introduction
Classifier Combination is an established term in Machine Learning [20],
more specifically in the Supervised Pattern Recognition area, denoting
supervised classification approaches in which several classifiers are brought
to contribute to the same recognition task [7]. Combining the predictions of
a set of component classifiers has been shown to yield accuracy higher than
that of the most accurate component on a wide variety of supervised
classification problems. Various decision strategies, involving these
classifiers in different ways, can be used to perform the combination
[32, 15, 7, 27]. Good introductions to the area can be found in [9] and [10].
Classifier combination can fuse together different information sources to uti-
lize their complementary information. The sources can be multi-modal, such as
speech and vision, but can also be transformations [14] or partitions [5, 2, 22] of
the same signal.
The combination, mixture, or ensemble of classification models can be
performed mainly by means of two approaches: