is equivalent to doubling the weight of misclassified instances) could be the
subject of exhaustive research.
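The reweighting scheme mentioned above can be sketched as follows. This is an illustrative fragment only, not the authors' implementation; the function name `reweight_misclassified` is hypothetical:

```python
import numpy as np

def reweight_misclassified(weights, y_true, y_pred):
    """Double the weight of each misclassified instance, then renormalize
    so the weights again sum to one (hypothetical helper)."""
    weights = np.asarray(weights, dtype=float).copy()
    wrong = np.asarray(y_true) != np.asarray(y_pred)
    weights[wrong] *= 2.0
    return weights / weights.sum()

# Four equally weighted instances; only the second one is misclassified,
# so its weight is doubled before renormalizing.
w = reweight_misclassified([0.25, 0.25, 0.25, 0.25],
                           [1, 0, 1, 0],
                           [1, 1, 1, 0])
```

After renormalization the misclassified instance carries twice the weight of each correctly classified one.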
Further work could focus on other classification tree construction methods,
such as C4.5 [24] or OC1 [21].
An extension of the presented approach is to select the feature subset that
yields the best performance from the classification point of view. A Feature
Subset Selection [12, 13, 26] technique can be applied to select which of the
predictor variables should be used. This could benefit both the hybrid
classifier construction and the final accuracy.
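A wrapper-style feature subset selection of the kind suggested above can be sketched as a greedy sequential forward search, scoring each candidate subset by leave-one-out accuracy of a 1-nearest-neighbour classifier. This is a minimal illustration under those assumptions, not the method of [12, 13, 26]; the function names are hypothetical:

```python
import numpy as np

def loo_1nn_accuracy(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    n = len(y)
    correct = 0
    for i in range(n):
        d = np.sum((X - X[i]) ** 2, axis=1)
        d[i] = np.inf                      # exclude the instance itself
        correct += y[np.argmin(d)] == y[i]
    return correct / n

def forward_selection(X, y):
    """Greedy sequential forward selection of predictor variables:
    repeatedly add the feature that most improves the wrapper score,
    stopping when no candidate improves it."""
    remaining = list(range(X.shape[1]))
    selected, best_acc = [], 0.0
    while remaining:
        scores = {f: loo_1nn_accuracy(X[:, selected + [f]], y)
                  for f in remaining}
        f, acc = max(scores.items(), key=lambda kv: kv[1])
        if acc <= best_acc:                # no feature improves accuracy
            break
        selected.append(f)
        remaining.remove(f)
        best_acc = acc
    return selected, best_acc

# Toy data: feature 0 separates the classes perfectly, feature 1 is noise.
X = np.array([[0., 5.], [0., 1.], [0., 9.], [1., 4.], [1., 2.], [1., 8.]])
y = np.array([0, 0, 0, 1, 1, 1])
selected, acc = forward_selection(X, y)
```

On this toy data the search keeps only the informative feature: adding the noise feature degrades the leave-one-out score, so the greedy loop stops after the first round.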
Acknowledgments
This work has been supported by the University of the Basque Country under
grant 1/UPV00140.226-E-15412/2003 and by the Gipuzkoako Foru Aldundia under
grant OF-761/2003.
References
1. D. Aha, D. Kibler, and M. K. Albert. Instance-based learning algorithms. Machine Learning, 6:37-66, 1991.
2. C. L. Blake and C. J. Merz. UCI repository of machine learning databases, 1998.
3. L. Breiman, J. Friedman, R. Olshen, and C. Stone. Classification and Regression Trees. Monterey, CA: Wadsworth, 1984.
4. T. M. Cover and P. E. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, IT-13(1):21-27, 1967.
5. R. G. Cowell, A. P. Dawid, S. L. Lauritzen, and D. J. Spiegelhalter. Probabilistic Networks and Expert Systems. Springer, 1999.
6. B. V. Dasarathy. Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press, 1991.
7. T. G. Dietterich. Machine learning research: four current directions. AI Magazine, 18(4):97-136, 1997.
8. Y. Freund and R. E. Schapire. A short introduction to boosting. Journal of Japanese Society for Artificial Intelligence, 14(5):771-780, 1999.
9. J. Gama. Combining Classification Algorithms. PhD thesis, University of Porto, 2000.
10. V. Gunes, M. Menard, and P. Loonis. Combination, cooperation and selection of classifiers: A state of the art. International Journal of Pattern Recognition, 17:1303-1324, 2003.
11. T. K. Ho and S. N. Srihari. Decision combination in multiple classifier systems. IEEE Transactions on Pattern Analysis and Machine Intelligence, 16:66-75, 1994.
12. I. Inza, P. Larranaga, R. Etxeberria, and B. Sierra. Feature subset selection by Bayesian networks based optimization. Artificial Intelligence, 123(1-2):157-184, 2000.
13. I. Inza, P. Larranaga, and B. Sierra. Feature subset selection by Bayesian networks: a comparison with genetic and sequential algorithms. International Journal of Approximate Reasoning, 27(2):143-164, 2001.