References
[1] Chang, C.-C., Pao, H.-K., Lee, Y.-J.: An RSVM based two-teachers-one-student semi-supervised learning algorithm. Neural Networks 25, 57-69 (2012)
[2] Chapelle, O., Zien, A.: Semi-supervised classification by low density separation
(2004)
[3] Fisher, R.A.: The use of multiple measurements in taxonomic problems. Annals
of Eugenics 7, 179-188 (1936)
[4] Huang, C.-M., Lee, Y.-J., Lin, D., Huang, S.-Y.: Model selection for support vector
machines via uniform design. Computational Statistics & Data Analysis 52(1),
335-346 (2007)
[5] Jolliffe, I.: Principal component analysis. Wiley Online Library (2005)
[6] Lee, Y.-J., Mangasarian, O.L.: SSVM: A smooth support vector machine for classification. Computational Optimization and Applications 20(1), 5-22 (2001)
[7] Lee, Y.-J., Mangasarian, O.L.: RSVM: Reduced support vector machines. In: Proceedings of the First SIAM International Conference on Data Mining, pp. 5-7. SIAM (2001)
[8] Li, K.-C.: Sliced inverse regression for dimension reduction. Journal of the Amer-
ican Statistical Association 86(414), 316-327 (1991)
[9] Su, K.-Y.: Kernel sliced inverse regression (KSIR) for semi-supervised learning. Master's thesis, NTUST (2014)
[10] Tang, W., Zhong, S.: Pairwise constraints-guided dimensionality reduction. In:
SDM Workshop on Feature Selection for Data Mining (2006)
[11] Wu, H.-M.: Kernel sliced inverse regression with applications to classification.
Journal of Computational and Graphical Statistics 17(3) (2008)
[12] Yeh, Y.-R., Huang, S.-Y., Lee, Y.-J.: Nonlinear dimension reduction with kernel sliced inverse regression. IEEE Transactions on Knowledge and Data Engineering 21(11), 1590-1603 (2009)
[13] Zhang, D., Zhou, Z.-H., Chen, S.: Semi-supervised dimensionality reduction. In:
SDM, pp. 629-634. SIAM (2007)