153. Marques de Sá, J., Sebastião, R., Gama, J., Fontes, T.: New results on minimum error entropy decision trees. In: San Martin, C., Kim, S.-W. (eds.) CIARP 2011. LNCS, vol. 7042, pp. 355-362. Springer, Heidelberg (2011)
154. Matula, D.: Cluster analysis via graph theoretic techniques. In: Mullin, R., Reid, K., Roselle, D. (eds.) Proc. Louisiana Conf. on Combinatorics, Graph Theory and Computing, pp. 199-212 (1970)
155. Matula, D.: k-components, clusters and slicings in graphs. SIAM J. Appl.
Math. 22(3), 459-480 (1972)
156. McCulloch, W., Pitts, W.: A logical calculus of the ideas immanent in nervous
activity. Bulletin of Mathematical Biophysics 5, 115-133 (1943)
157. Meila, M., Shi, J.: A random walks view of spectral segmentation. In: 8th Int.
Workshop on Artificial Intelligence and Statistics, pp. 8-11 (2001)
158. Mergel, V.: Test of goodness of fit for the inverse-Gaussian distribution. Math.
Commun. 4(2), 191-195 (1999)
159. Mokkadem, A.: Estimation of the entropy and information of absolutely continuous random variables. IEEE Trans. Information Theory 35(1), 193-196 (1989)
160. Møller, M.: Efficient training of feed-forward neural networks. Ph.D. thesis, Computer Science Department, Aarhus University (1993)
161. Murthy, S.: Automatic construction of decision trees from data: A multi-disciplinary survey. Data Mining and Knowledge Discovery 2(4), 345-389 (1998)
162. Nadarajah, S., Zografos, K.: Formulas for Rényi information and related measures for univariate distributions. Inf. Sci. 155(1-2), 119-138 (2003)
163. NCI60: Stanford NCI60 cancer microarray project,
http://genome-www.stanford.edu/nci60/ (2000)
164. Neelakanta, P., Abusalah, S., Groff, D., Sudhakar, R., Park, J.: Csiszar's generalized error measures for gradient-descent-based optimizations in neural networks using the backpropagation algorithm. Connection Science 8(1), 79-114 (1996)
165. Nelder, J., Mead, R.: A simplex method for function minimization. Computer
Journal 7, 308-313 (1965)
166. Ng, A., Jordan, M., Weiss, Y.: On spectral clustering: Analysis and an algorithm. In: Advances in Neural Information Processing Systems, vol. 14 (2001)
167. Ng, G., Wahab, A., Shi, D.: Entropy learning and relevance criteria for neural
network pruning. Int. Journal of Neural Systems 13(5), 291-305 (2003)
168. Papoulis, A.: Probability, Random Variables and Stochastic Processes.
McGraw-Hill Co. Inc. (1991)
169. Parzen, E.: On the estimation of a probability density function and the mode.
Annals of Mathematical Statistics 33, 1065-1076 (1962)
170. Pelagotti, A., Piuri, V.: Entropic analysis and incremental synthesis of multilayered feedforward neural networks. Int. Journal of Neural Systems 8(5-6), 647-659 (1997)
171. Pérez-Ortiz, J., Gers, F., Eck, D., Schmidhuber, J.: Kalman filters improve
LSTM network performance in problems unsolvable by traditional recurrent
nets. Neural Networks 16(2), 241-250 (2003)
172. Pipberger, H., Arms, R., Stallmann, F.: Automatic screening of normal and abnormal electrocardiograms by means of a digital electronic computer. In: Proc. Soc. Exp. Biol. Med., pp. 106-130 (1961)
173. Porter, M.: An algorithm for suffix stripping. Program 14(3), 130-137 (1980)
174. Principe, J.: Information Theoretic Learning: Rényi's Entropy and Kernel
Perspectives. Springer (2010)
175. Principe, J., Xu, D.: Learning from examples with information theoretic criteria. Journal of VLSI Signal Processing Systems 26(1-2), 61-77 (2000)
 