Graphics Reference
In-Depth Information
22. Kira, K., Rendell, L.A.: A practical approach to feature selection. In: Proceedings of the Ninth
International Workshop on Machine Learning, ML92, pp. 249-256 (1992)
23. Kohavi, R., John, G.: Wrappers for feature subset selection. Artif. Intell. 97 (1-2), 273-324
(1997)
24. Koller, D., Sahami, M.: Toward optimal feature selection. In: Proceedings of the Thirteenth
International Conference on Machine Learning, pp. 284-292 (1996)
25. Kononenko, I.: Estimating attributes: Analysis and extensions of relief. In: Proceedings of the
European Conference on Machine Learning on Machine Learning, ECML-94, pp. 171-182
(1994)
26. Kudo, M., Sklansky, J.: Comparison of algorithms that select features for pattern classifiers.
Pattern Recognit. 33 (1), 25-41 (2000)
27. Kwak, N., Choi, C.H.: Input feature selection by mutual information based on parzen window.
IEEE Trans. Pattern Anal. Mach. Intell. 24 (12), 1667-1671 (2002)
28. Kwak, N., Choi, C.H.: Input feature selection for classification problems. IEEE Trans. Neural
Netw. 13 (1), 143-159 (2002)
29. Liu, H., Motoda, H.: Feature Selection for Knowledge Discovery and Data Mining. Kluwer
Academic, USA (1998)
30. Liu, H., Motoda, H., Dash, M.: A monotonic measure for optimal feature selection. In: Pro-
ceedings of European Conference of Machine Learning, Lecture Notes in Computer Science
vol. 1398, pp. 101-106 (1998)
31. Liu, H., Setiono, R.: A probabilistic approach to feature selection - a filter solution. In: Pro-
ceedings of the International Conference on Machine Learning (ICML), pp. 319-327 (1996)
32. Liu, H., Sun, J., Liu, L., Zhang, H.: Feature selection with dynamic mutual information. Pattern
Recognit. 42 (7), 1330-1339 (2009)
33. Liu, H., Yu, L.: Toward integrating feature selection algorithms for classification and clustering.
IEEE Trans. Knowl. Data Eng. 17 (4), 491-502 (2005)
34. Maldonado, S., Weber, R.: A wrapper method for feature selection using support vector
machines. Inf. Sci. 179 (13), 2208-2217 (2009)
35. Mitra, P., Murthy, C., Pal, S.: Unsupervised feature selection using feature similarity. IEEE
Trans. Pattern Anal. Mach. Intell. 24 (3), 301-312 (2002)
36. Modha, D., Spangler, W.: Feature weighting on k-means clustering. Mach. Learn. 52 (3), 217-
237 (2003)
37. Mucciardi, A.N., Gose, E.E.: A Comparison of Seven Techniques for Choosing Subsets of
Pattern Recognition Properties, pp. 1023-1031. IEEE, India (1971)
38. Narendra, P.M., Fukunaga, K.: A branch and bound algorithm for feature subset selection.
IEEE Trans. Comput. 26 (9), 917-922 (1977)
39. Nguyen, M., de la Torre, F.: Optimal feature selection for support vector machines. Pattern
Recognit. 43 (3), 584-591 (2010)
40. Oh, I.S., Lee, J.S., Moon, B.R.: Hybrid genetic algorithms for feature selection. IEEE Trans.
Pattern Anal. Mach. Intell. 26 (11), 1424-1437 (2004)
41. Opitz, D.W.: Feature selection for ensembles. In: Proceedings of the National Conference on
Artificial Intelligence, pp. 379-384 (1999)
42. Peng, H., Long, F., Ding, C.: Feature selection based on mutual information: Criteria of max-
dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell.
27 (8), 1226-1238 (2005)
43. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, USA
(1993)
44. Raymer, M., Punch, W., Goodman, E., Kuhn, L., Jain, A.: Dimensionality reduction using
genetic algorithms. IEEE Trans. Evolut. Comput. 4 (2), 164-171 (2000)
45. Robnik-Łikonja, M., Kononenko, I.: Theoretical and empirical analysis of relieff and rrelieff.
Mach. Learn. 53 (1-2), 23-69 (2003)
46. Rodriguez-Lujan, I., Huerta, R., Elkan, C., Cruz, C.: Quadratic programming feature selection.
J. Mach. Learn. Res. 11 , 1491-1516 (2010)
Search WWH ::




Custom Search