5. A.K. Jain, M.N. Murty, P.J. Flynn, Data clustering: a review. ACM Comput. Surv. 31(3) (1999)
6. C. Williams, A MCMC approach to hierarchical mixture modelling. Int. Conf. Neural Inf.
Process. Sys. NIPS 13, 680-686 (1999)
7. R.M. Neal, Density modeling and clustering using Dirichlet diffusion trees. Bayesian Stat. 7,
619-629 (2003)
8. C. Kemp, T.L. Griffiths, S. Stromsten, J.B. Tenenbaum, Semi-supervised learning with trees. Int. Conf. Neural Inf. Process. Sys. NIPS 17 (2003)
9. N. Vasconcelos, A. Lippman, Learning mixture hierarchies. Int. Conf. Neural Inf. Process.
Sys. NIPS 12, 606-612 (1998)
10. A. Stolcke, S. Omohundro, Hidden Markov model induction by Bayesian model merging. Int.
Conf. Neural Inf. Process. Sys. 6, 11-18 (1992)
11. J.D. Banfield, A.E. Raftery, Model-based Gaussian and non-Gaussian clustering. Biometrics 49(3), 803-821 (1993)
12. S. Vaithyanathan, B. Dom, Model-based hierarchical clustering. Uncertain Artif. Intell. 16,
599-608 (2000)
13. E. Segal, D. Koller, D. Ormoneit, Probabilistic abstraction hierarchies. Int. Conf. Neural Inf. Process. Sys. NIPS 15, 913-920 (2001)
14. M.F. Ramoni, P. Sebastiani, I.S. Kohane, Cluster analysis of gene expression dynamics. Proc. Natl. Acad. Sci. USA 99, 9121-9126 (2002)
15. N. Friedman, Pcluster: probabilistic agglomerative clustering of gene expression profiles. Technical report, vol 80 (Hebrew University 2003)
16. K.A. Heller, Z. Ghahramani, Bayesian hierarchical clustering. ACM International Conference Proceeding Series, Proceedings of the 22nd International Conference on Machine Learning, vol 119 (Bonn, Germany, 2005), pp 297-304
17. C.M. Bishop, M.E. Tipping, A hierarchical latent variable model for data visualization. IEEE
Trans. Pattern Anal. Mach. Intell. 20(3), 281-293 (1998)
18. M.E. Tipping, C.M. Bishop, Probabilistic principal component analysis. J. R. Stat. Soc. Series
B 61(3), 611-622 (1999)
19. M.E. Tipping, C.M. Bishop, Mixtures of probabilistic principal component analyzers. Neural Comput. 11(2), 443-482 (1999)
20. H.J. Park, T.W. Lee, Capturing nonlinear dependencies in natural images using ICA and
mixture of Laplacian distribution. Neurocomputing 69, 1513-1528 (2006)
21. D.J.C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, Cambridge, 2004)
22. F.R. Bach, M.I. Jordan, Beyond independent components: trees and clusters. J. Mach. Learn.
Res. 3, 1205-1233 (2003)
23. A. Hyvärinen, P.O. Hoyer, M. Inki, Topographic independent component analysis. Neural
Comput. 13(7), 1527-1558 (2001)
24. R.S. Raghavan, A method for estimating parameters of K-distributed clutter. IEEE Trans. Aerosp. Electron. Sys. 27(2), 268-275 (1991)
25. J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms (Plenum Press,
New York, 1981)
26. A.J. Bell, T.J. Sejnowski, The "independent components" of natural scenes are edge filters. Vis. Res. 37(23), 3327-3338 (1997)
27. J.H. van Hateren, A. van der Schaaf, Independent component filters of natural images compared with simple cells in primary visual cortex. Proc. R. Soc. Lond. B 265, 359-366 (1998)
28. Y. Matsuda, K. Yamaguchi, Linear multilayer ICA generating hierarchical edge detectors.
Neural Comput. 19(1), 218-230 (2007)
29. T.W. Lee, M.S. Lewicki, T.J. Sejnowski, ICA mixture models for unsupervised classification of non-Gaussian classes and automatic context switching in blind signal separation. IEEE Trans. Pattern Anal. Mach. Intell. 22(10), 1078-1089 (2000)