Graphics Reference
In-Depth Information
References
1. Aha, D.W., Kibler, D., Albert, M.K.: Instance-based learning algorithms. Mach. Learn. 6 (1),
37-66 (1991)
2. Aha, D.W. (ed.): Lazy Learning. Springer, Heidelberg (2010)
3. Alcalá-Fdez, J., Sánchez, L., García, S., del Jesus, M.J., Ventura, S., Garrell, J.M., Otero, J.,
Romero, C., Bacardit, J., Rivas, V.M., Fernández, J.C., Herrera, F.: KEEL: a software tool
to assess evolutionary algorithms for data mining problems. Soft Comput. 13 (3), 307-318
(2009)
4. Alcalá-Fdez, J., Fernández, A., Luengo, J., Derrac, J., García, S., Sánchez, L., Herrera, F.:
KEEL data-mining software tool: Data set repository, integration of algorithms and experi-
mental analysis framework. J. Multiple-Valued Logic Soft Comput. 17 (2-3), 255-287 (2011)
5. Alpaydin, E.: Voting over multiple condensed nearest neighbors. Artif. Intell. Rev. 11 (1-5),
115-132 (1997)
6. Angiulli, F., Folino, G.: Distributed nearest neighbor-based condensation of very large data
sets. IEEE Trans. Knowl. Data Eng. 19 (12), 1593-1606 (2007)
7. Angiulli, F.: Fast nearest neighbor condensation for large data sets classification. IEEE Trans.
Knowl. Data Eng. 19 (11), 1450-1464 (2007)
8. Antonelli, M., Ducange, P., Marcelloni, F.: Genetic training instance selection in multiobjec-
tive evolutionary fuzzy systems: A coevolutionary approach. IEEE Trans. Fuzzy Syst. 20 (2),
276-290 (2012)
9. Barandela, R., Cortés, N., Palacios, A.: The nearest neighbor rule and the reduction of the
training sample size. Proceedings of the IX Symposium of the Spanish Society for Pattern
Recognition (2001)
10. Barandela, R., Ferri, F.J., Sánchez, J.S.: Decision boundary preserving prototype selection for
nearest neighbor classification. Int. J. Pattern Recognit Artif Intell. 19 (6), 787-806 (2005)
11. Batista, G.E.A.P.A., Prati, R.C., Monard, M.C.: A study of the behavior of several methods
for balancing machine learning training data. SIGKDD Explor. Newsl. 6 (1), 20-29 (2004)
12. Bezdek, J.C., Kuncheva, L.I.: Nearest prototype classifier designs: An experimental study.
Int. J. Intell. Syst. 16 , 1445-1473 (2001)
13. Bien, J., Tibshirani, R.: Prototype selection for interpretable classification. Ann. Appl. Stat.
5 (4), 2403-2424 (2011)
14. Borzeshi, Z.E., Piccardi, M., Riesen, K., Bunke, H.: Discriminative prototype selection meth-
ods for graph embedding. Pattern Recognit. 46 , 1648-1657 (2013)
15. Brighton, H., Mellish, C.: Advances in instance selection for instance-based learning algo-
rithms. Data Min. Knowl. Disc. 6 (2), 153-172 (2002)
16. Brodley, C.E.: Recursive automatic bias selection for classifier construction. Mach. Learn.
20 (1-2), 63-94 (1995)
17. Cai, Y.-H., Wu, B., He, Y.-L., Zhang, Y.: A new instance selection algorithm based on contri-
bution for nearest neighbour classification. In: International Conference on Machine Learning
and Cybernetics (ICMLC), pp. 155-160 (2010)
18. Cameron-Jones, R.M.: Instance selection by encoding length heuristic with random muta-
tion hill climbing. In: Proceedings of the Eighth Australian Joint Conference on Artificial
Intelligence, pp. 99-106 (1995)
19. Cano, J.R., Herrera, F., Lozano, M.: Using evolutionary algorithms as instance selection for
data reduction in KDD: an experimental study. IEEE Trans. Evol. Comput. 7 (6), 561-575
(2003)
20. Cano, J.R., Herrera, F., Lozano, M.: Stratification for scaling up evolutionary prototype selec-
tion. Pattern Recogn. Lett. 26 (7), 953-963 (2005)
21. Cano, J.R., Herrera, F., Lozano, M.: Evolutionary stratified training set selection for extracting
classification rules with trade off precision-interpretability. Data Knowl. Eng. 60 (1), 90-108
(2007)
 
Search WWH ::




Custom Search