60. Duin, R.: Dutch handwritten numerals,
http://www.ph.tn.tudelft.nl/~duin
61. Dwass, M.: Probability: Theory and Applications. W. A. Benjamin, Inc. (1970)
62. Ebrahimi, N., Soofi, E., Soyer, R.: Information measures in perspective. Int.
Statistical Review 78, 383-412 (2010)
63. Elman, J.: Finding structure in time. Tech. Rep. CRL 8801, University of
California, Center for Research in Language, San Diego (1988)
64. Ennaji, A., Ribert, A., Lecourtier, Y.: From data topology to a modular classifier. Int. Journal on Document Analysis and Recognition 6(1), 1-9 (2003)
65. Erdogmus, D., Hild II, K., Principe, J.: Blind source separation using Rényi's α-marginal entropies. Neurocomputing 49, 25-38 (2002)
66. Erdogmus, D., Principe, J.: Comparison of entropy and mean square error
criteria in adaptive system training using higher order statistics. In: Int. Conf.
on ICA and Signal Separation, Helsinki, Finland, pp. 75-80 (2000)
67. Erdogmus, D., Principe, J.: An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems. IEEE Trans. on Signal Processing 50(7), 1780-1786 (2002)
68. Esposito, F., Malerba, D., Semeraro, G., Kay, J.: A comparative analysis of
methods for pruning decision trees. IEEE Trans. on Pattern Analysis and
Machine Intelligence 19(5), 476-493 (1997)
69. Ester, M., Kriegel, H.P., Sander, J., Xu, X.: A density-based algorithm for
discovering clusters in large spatial databases with noise. In: 2nd Int. Conf.
on Knowledge Discovery and Data Mining, pp. 226-231. AAAI Press (1996)
70. Álvarez Estévez, D., Príncipe, J., Moret-Bonillo, V.: Neuro-fuzzy classification
using the correntropy criterion: Application to sleep depth estimation. In:
Arabnia, H., de la Fuente, D., Kozerenko, E., Olivas, J., Chang, R., LaMonica,
P., Liuzzi, R., Solo, A. (eds.) IC-AI, pp. 9-15. CSREA Press (2010)
71. Faivishevsky, L., Goldberger, J.: A nonparametric information theoretic clustering algorithm. In: Proc. of the 27th Int. Conf. on Machine Learning, pp. 351-358 (2010)
72. Fahlman, S.: Faster-learning variations on back-propagation: An empirical study. In: Touretzky, D., Hinton, G., Sejnowski, T. (eds.) Connectionist Models Summer School, pp. 38-51. Morgan Kaufmann (1988)
73. Fiedler, M.: A property of eigenvectors of nonnegative symmetric matrices and
its application to graph theory. Czechoslovak Mathematical Journal 25(100),
619-633 (1975)
74. Foley, D.: Considerations of sample and feature size. IEEE Trans. on Information Theory 18(5), 618-626 (1972)
75. Forina, M., Armanino, C.: Eigenvector projection and simplified non-linear mapping of fatty acid content of Italian olive oils. Ann. Chim. 72, 127-155 (1981)
76. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press Professional, Inc. (1990)
77. Fukunaga, K., Hostetler, L.: The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. on Information Theory 21, 32-40 (1975)
78. Funahashi, K.: Multilayer neural networks and Bayes decision theory. Neural
Networks 11(2), 209-213 (1998)
79. García, S., Fernández, A., Luengo, J., Herrera, F.: Advanced nonparametric
tests for multiple comparisons in the design of experiments in computational
intelligence and data mining: Experimental analysis of power. Information
Sciences 180, 2044-2064 (2010)
80. Gelfand, S., Ravishankar, C., Delp, E.: An iterative growing and pruning algorithm for classification tree design. IEEE Trans. on Pattern Analysis and Machine Intelligence 13(2), 163-174 (1991)