Cohen S., Rokach L., and Maimon O., Decision tree instance space decomposition with grouped gain-ratio, Information Sciences 177(17):3592-3612, 2007.
Coppock D. S., Data modeling and mining: Why lift?, DM Review online, June 2002.
Crawford S. L., Extensions to the CART algorithm, International Journal of Man-Machine Studies 31(2):197-217, August 1989.
Cunningham P., and Carney J., Diversity versus quality in classification ensembles based on feature selection, In: Proc. ECML 2000, 11th European Conf. on Machine Learning, R. L. de Mantaras and E. Plaza (eds.), Barcelona, Spain, LNCS 1810, Springer, pp. 109-116, 2000.
Curtis H. A., A New Approach to the Design of Switching Circuits, Van Nostrand, Princeton, 1962.
Dahan H., Cohen S., Rokach L., and Maimon O., Proactive Data Mining with Decision Trees, Springer, 2014.
Dempster A. P., Laird N. M., and Rubin D. B., Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society 39(B):1-38, 1977.
Dhillon I., and Modha D., Concept decomposition for large sparse text data using clustering, Machine Learning 42:143-175, 2001.
Derbeko P., El-Yaniv R., and Meir R., Variance optimized bagging, European Conference on Machine Learning, 2002.
Dietterich T. G., Approximate statistical tests for comparing supervised classification learning algorithms, Neural Computation 10(7):1895-1924, 1998.
Dietterich T. G., An experimental comparison of three methods for constructing
ensembles of decision trees: Bagging, boosting and randomization, Machine
Learning 40(2):139-157, 2000a.
Dietterich T. G., Ensemble methods in machine learning, In First International Workshop on Multiple Classifier Systems, J. Kittler and F. Roli (eds.), Lecture Notes in Computer Science, pp. 1-15, Springer-Verlag, 2000b.
Dietterich T. G., and Bakiri G., Solving multiclass learning problems via error-
correcting output codes, Journal of Artificial Intelligence Research 2:263-
286, 1995.
Dietterich T. G., and Kong E. B., Machine learning bias, statistical bias, and
statistical variance of decision tree algorithms, Technical Report, Oregon
State University, 1995.
Dietterich T. G., and Michalski R. S., A comparative review of selected methods
for learning from examples, Machine Learning, an Artificial Intelligence
Approach 1:41-81, 1983.
Dietterich T. G., Kearns M., and Mansour Y., Applying the weak learning framework to understand and improve C4.5, In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 96-104, San Francisco: Morgan Kaufmann, 1996.
Dimitriadou E., Weingessel A., and Hornik K., A Cluster Ensembles Framework, Design and Application of Hybrid Intelligent Systems, Amsterdam, The Netherlands: IOS Press, 2003.