involved in the experimental study. According to them, ReliefF turned out to be the
best option regardless of the particulars of the data, with the added advantage of being
a filter with low computational cost. As in the study mentioned before, wrapper
approaches proved to be an interesting choice in some domains, provided that they are
applied with the same classifiers and that their higher computational cost is taken
into account.
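Since ReliefF is singled out above, a minimal NumPy sketch of its multi-class weight update may help fix ideas. This is an illustrative reconstruction of the standard algorithm (Kononenko's multi-class extension of Relief), not the implementation evaluated in the study; the function name relieff and its parameters are ours, and it assumes numeric features already scaled to [0, 1] and at least two classes present.

import numpy as np

def relieff(X, y, n_neighbors=10, n_samples=None, seed=0):
    """Sketch of ReliefF: score each feature by how well it separates
    an instance from its nearest misses versus its nearest hits."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = n if n_samples is None else n_samples
    classes, counts = np.unique(y, return_counts=True)
    priors = dict(zip(classes, counts / n))
    w = np.zeros(d)
    for i in rng.choice(n, size=m, replace=False):
        dist = np.abs(X - X[i]).sum(axis=1)  # L1 distance to every instance
        dist[i] = np.inf                     # never pick the instance itself
        # nearest hits (same class): penalise features that differ
        hits = np.where(y == y[i])[0]
        hits = hits[np.argsort(dist[hits])][:n_neighbors]
        w -= np.abs(X[hits] - X[i]).mean(axis=0) / m
        # nearest misses (each other class, weighted by its prior):
        # reward features that differ
        for c in classes:
            if c == y[i]:
                continue
            miss = np.where(y == c)[0]
            miss = miss[np.argsort(dist[miss])][:n_neighbors]
            scale = priors[c] / (1.0 - priors[y[i]])
            w += scale * np.abs(X[miss] - X[i]).mean(axis=0) / m
    return w  # higher weight = more relevant feature

With features scaled to [0, 1], np.argsort(relieff(X, y))[::-1] ranks features from most to least relevant; a single pass over m sampled instances is what keeps the filter's computational cost low.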
References
1. Aha, D.W. (ed.): Lazy Learning. Springer, Berlin (2010)
2. Almuallim, H., Dietterich, T.G.: Learning with many irrelevant features. In: Proceedings of the
Ninth National Conference on Artificial Intelligence, pp. 547-552 (1991)
3. Arauzo-Azofra, A., Aznarte, J., Benítez, J.: Empirical study of feature selection methods based
on individual feature evaluation for classification problems. Expert Syst. Appl. 38(7), 8170-8177
(2011)
4. Arauzo-Azofra, A., Benítez, J., Castro, J.: Consistency measures for feature selection. J. Intell.
Inf. Syst. 30(3), 273-292 (2008)
5. Battiti, R.: Using mutual information for selecting features in supervised neural net learning.
IEEE Trans. Neural Netw. 5(4), 537-550 (1994)
6. Blum, A.L., Langley, P.: Selection of relevant features and examples in machine learning. Artif.
Intell. 97(1-2), 245-271 (1997)
7. Bolón-Canedo, V., Sánchez-Maroño, N., Alonso-Betanzos, A.: A review of feature selection
methods on synthetic data. Knowl. Inf. Syst. 34(3), 483-519 (2013)
8. Brown, G., Pocock, A., Zhao, M.J., Luján, M.: Conditional likelihood maximisation: a unifying
framework for information theoretic feature selection. J. Mach. Learn. Res. 13, 27-66 (2012)
9. Cornelis, C., Jensen, R., Hurtado, G., Slezak, D.: Attribute selection with fuzzy decision reducts.
Inf. Sci. 180(2), 209-224 (2010)
10. Dash, M., Liu, H.: Feature selection for classification. Intell. Data Anal. 1(3), 131-156 (1997)
11. Dash, M., Liu, H.: Consistency-based search in feature selection. Artif. Intell. 151(1-2), 155-176
(2003)
12. Doak, J.: An Evaluation of Feature Selection Methods and Their Application to Computer
Security. Tech. rep., UC Davis Department of Computer Science (1992)
13. Elghazel, H., Aussem, A.: Unsupervised feature selection with ensemble learning. Machine
Learning, pp. 1-24. Springer, Berlin (2013)
14. Estévez, P., Tesmer, M., Perez, C., Zurada, J.: Normalized mutual information feature selection.
IEEE Trans. Neural Netw. 20(2), 189-201 (2009)
15. Gunal, S., Edizkan, R.: Subspace based feature selection for pattern recognition. Inf. Sci.
178(19), 3716-3726 (2008)
16. Guyon, I., Elisseeff, A.: An introduction to variable and feature selection. J. Mach. Learn. Res.
3, 1157-1182 (2003)
17. Hu, Q., Yu, D., Liu, J., Wu, C.: Neighborhood rough set based heterogeneous feature subset
selection. Inf. Sci. 178(18), 3577-3594 (2008)
18. Jain, A.: Feature selection: evaluation, application, and small sample performance. IEEE Trans.
Pattern Anal. Mach. Intell. 19(2), 153-158 (1997)
19. Javed, K., Babri, H., Saeed, M.: Feature selection based on class-dependent densities for high-
dimensional binary data. IEEE Trans. Knowl. Data Eng. 24(3), 465-477 (2012)
20. Jensen, R., Shen, Q.: Fuzzy-rough sets assisted attribute selection. IEEE Trans. Fuzzy Syst.
15(1), 73-89 (2007)
21. Kalousis, A., Prados, J., Hilario, M.: Stability of feature selection algorithms: a study on
high-dimensional spaces. Knowl. Inf. Syst. 12(1), 95-116 (2007)