REFERENCES
1. N. V. Chawla, D. A. Cieslak, L. O. Hall, and A. Joshi, "Automatically countering imbalance and its empirical relationship to cost," Data Mining and Knowledge Discovery, vol. 17, no. 2, pp. 225-252, 2008.
2. M. Kubat and S. Matwin, "Addressing the curse of imbalanced training sets: One-sided selection," in Fourteenth International Conference on Machine Learning, (Nashville, TN, USA), pp. 179-186, Morgan Kaufmann, 1997.
3. I. Tomek, "An experiment with the edited nearest-neighbor rule," IEEE Transactions on Systems, Man, and Cybernetics, vol. 6, no. 6, pp. 448-452, 1976.
4. P. E. Hart, N. J. Nilsson, and B. Raphael, "A formal basis for the heuristic determination of minimum cost paths," IEEE Transactions on Systems Science and Cybernetics, vol. 4, no. 2, pp. 100-107, 1968.
5. J. Laurikkala, "Improving identification of difficult small classes by balancing class distribution," in Artificial Intelligence in Medicine, (Cascais, Portugal), vol. 2101, pp. 63-66, Springer-Verlag, 2001.
6. N. V. Chawla, K. W. Bowyer, L. O. Hall, and W. P. Kegelmeyer, "SMOTE: Synthetic minority over-sampling technique," Journal of Artificial Intelligence Research, vol. 16, pp. 321-357, 2002.
7. H. Han, W. Y. Wang, and B. H. Mao, "Borderline-SMOTE: A new over-sampling method in imbalanced data sets learning," in Advances in Intelligent Computing, (Hefei, China), vol. 3644, pp. 878-887, Springer-Verlag, 2005.
8. C. Bunkhumpornpat, K. Sinapiromsaran, and C. Lursinsap, "Safe-Level-SMOTE: Safe-level-synthetic minority over-sampling technique for handling the class imbalanced problem," in Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining, (Bangkok, Thailand), pp. 475-482, Springer-Verlag, 2009.
9. C. Bunkhumpornpat, K. Sinapiromsaran, and C. Lursinsap, "DBSMOTE: Density-based synthetic minority over-sampling technique," Applied Intelligence, vol. 36, pp. 1-21, 2011.
10. T. Jo and N. Japkowicz, "Class imbalances versus small disjuncts," ACM SIGKDD Explorations Newsletter, vol. 6, no. 1, pp. 40-49, 2004.
11. N. Japkowicz, "Learning from imbalanced data sets: A comparison of various strategies," in AAAI Workshop on Learning from Imbalanced Data Sets, (Austin, TX, USA), vol. 68, AAAI Press, 2000.
12. G. E. Batista, R. C. Prati, and M. C. Monard, "A study of the behavior of several methods for balancing machine learning training data," ACM SIGKDD Explorations Newsletter, vol. 6, no. 1, pp. 20-29, 2004.
13. T. G. Dietterich, "Ensemble methods in machine learning," Lecture Notes in Computer Science, vol. 1857, pp. 1-15, 2000.
14. L. K. Hansen and P. Salamon, "Neural network ensembles," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993-1001, 1990.
15. L. Breiman, "Bagging predictors," Machine Learning, vol. 24, no. 2, pp. 123-140, 1996.
16. Y. Freund and R. Schapire, "Experiments with a new boosting algorithm," in Thirteenth International Conference on Machine Learning, (Bari, Italy), pp. 148-156, Morgan Kaufmann, 1996.