176. Prokhorov, A.V.: Pearson Curves. In: Hazewinkel, M. (ed.) Encyclopaedia of
Mathematics. Kluwer Academic Pub. (2002)
177. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann Pub-
lishers, Inc. (1993)
178. Raileanu, L., Stoffel, K.: Theoretical comparison between the Gini index and
information gain criteria. Annals of Mathematics and Artificial Intelligence 41,
77-93 (2004)
179. Rama, B., Jayashree, P., Jiwani, S.: A survey on clustering: Current status and
challenging issues. Int. Journal on Computer Science and Engineering 2(9),
2976-2980 (2010)
180. Rand, W.: Objective criteria for the evaluation of clustering methods. Journal
of the American Statistical Association 66, 846-850 (1971)
181. Rao, B.L.S.P.: Nonparametric Functional Estimation. Academic Press, Inc.
(1983)
182. Raudys, S., Pikelis, V.: On dimensionality, sample size, classification error, and
complexity of classification algorithm in pattern recognition. IEEE Trans. Pat-
tern Analysis and Machine Intelligence 2(3), 242-252 (1980)
183. Rényi, A.: Probability Theory. Elsevier Science Pub. Co. Inc. (1970)
184. Reza, F.: An Introduction to Information Theory. Dover Pub. (1994)
185. Richard, M., Lippmann, R.: Neural network classifiers estimate Bayesian a
posteriori probabilities. Neural Computation 3, 461-483 (1991)
186. Riedmiller, M., Braun, H.: A direct adaptive method for faster backpropaga-
tion learning: The Rprop algorithm. In: IEEE Int. Conf. on Neural Networks,
pp. 586-591 (1993)
187. Robinson, A., Fallside, F.: The utility driven dynamic error propagation net-
work. Tech. Rep. CUED/F-INFENG/TR.1, Cambridge University, Engineer-
ing Department (1987)
188. Rokach, L., Maimon, O.: Decision trees. In: Maimon, O., Rokach, L. (eds.)
Data Mining and Knowledge Discovery Handbook. Springer (2005)
189. Rosasco, L., De Vito, E., Caponnetto, A., Piana, M.: Are loss functions all
the same? Neural Computation 16(5), 1063-1076 (2004)
190. Rosenblatt, F.: The perceptron: a probabilistic model for information storage
and organization in the brain. Psychological Review 65, 386-408 (1958)
191. Rueda, L.: A one-dimensional analysis for the probability of error of linear
classifiers for normally distributed classes. Pattern Recognition 38(8), 1197-
1207 (2005)
192. Rumelhart, D., Hinton, G., Williams, R.: Learning representations by back-
propagating errors. Nature 323, 533-536 (1986)
193. Saerens, M., Latinne, P., Decaestecker, C.: Any reasonable cost function can
be used for a posteriori probability approximation. IEEE Trans. on Neural
Networks 13(5), 1204-1210 (2002)
194. Safavian, S., Landgrebe, D.: A survey of decision tree classifier methodology.
IEEE Trans. on Systems Man and Cybernetics 21(3), 660-674 (1991)
195. Salzberg, S.: On comparing classifiers: Pitfalls to avoid and a recommended
approach. Data Mining and Knowledge Discovery 1, 317-327 (1997)
196. Santamaría, I., Pokharel, P., Príncipe, J.: Generalized correlation function:
definition, properties, and application to blind equalization. IEEE Trans. on
Signal Processing 54(6-1), 2187-2197 (2006)
197. Santos, J.: Repository of data sets used on technical report Human Clustering
on Bi-dimensional Data: An Assessment (2005),
http://www.dema.isep.ipp.pt/~jms/datasets
198. Santos, J.: Data classification with neural networks and entropic criteria.
Ph.D. thesis, University of Porto (2007)