able to verify the substantial differences in performance among the mentioned
methods. Because of the random nature of the tested classifier-building methods,
LOOCV is repeated ten times for all randomness-based methods (both CMMC and
Random Forests) and the accuracy is averaged over all runs. As indicated in [8],
1,000 artificial data points are generated for the CMMC-1 and CMMC-2 methods,
while CMMC-3 uses the same number of artificial data points in every "upgraded"
leaf.
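As a concrete illustration of this protocol, the sketch below repeats a full
leave-one-out pass ten times for a randomized classifier and averages the
resulting accuracies. It is a minimal sketch assuming scikit-learn-style
estimators; the make_classifier factory and the Random Forest settings are
illustrative placeholders, not the authors' implementation.

    # Repeated LOOCV, as described above: rerun the full leave-one-out
    # evaluation several times for a randomized method and average.
    import numpy as np
    from sklearn.model_selection import LeaveOneOut
    from sklearn.ensemble import RandomForestClassifier

    def loocv_accuracy(make_classifier, X, y):
        """One full leave-one-out pass; returns the fraction of
        correctly classified held-out samples."""
        hits = 0
        for train_idx, test_idx in LeaveOneOut().split(X):
            clf = make_classifier()          # fresh classifier per fold
            clf.fit(X[train_idx], y[train_idx])
            hits += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
        return hits / len(y)

    def repeated_loocv_accuracy(make_classifier, X, y, repeats=10):
        """Average accuracy over `repeats` LOOCV runs, as done here for
        the randomness-based methods (CMMC variants, Random Forests)."""
        return float(np.mean([loocv_accuracy(make_classifier, X, y)
                              for _ in range(repeats)]))

    # Example (hypothetical data X, y):
    # acc = repeated_loocv_accuracy(lambda: RandomForestClassifier(), X, y)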
3.4 Results
This section highlights the key findings obtained by applying the adapted
CMM method to four publicly available microarray datasets. Table 1 shows the
accuracy comparison. The tests show that all three proposed methods gained
some accuracy compared to a simple decision tree, but they still fall well
short of the ensemble of classifiers.
To keep the complexity of the built decision trees low, pruning was used for
all decision trees in the experiment. The average complexity (i.e., the number
of rules) of the decision trees is presented in Table 2. We do not present the
complexity of the Random Forest method, as it can be estimated at roughly 100
times that of a simple decision tree and is therefore completely unsuitable
for interpretation. The most significant fact revealed by Table 2 is the low
rule complexity of the CMMC-2-generated decision trees, especially when
compared to the CMMC-1 trees. Trees from our second proposed method
Table 1. Comparison of accuracy (%) for decision tree (C4.5), proposed decision
tree building methods and Random Forests (RF)

Dataset      C4.5   CMMC-1  CMMC-2  CMMC-3  RF
amlall1      80.56  90.40   89.20   87.58   97.92
amlall2      79.17  91.29   88.70   88.44   98.30
amlall3      79.17  90.85   87.94   88.30   98.80
amlallAvg    79.63  90.85   88.61   88.11   98.34
breast1      66.67  73.89   67.19   72.85   85.84
breast2      61.54  64.74   65.62   65.66   81.00
breast3      71.79  66.49   66.78   65.04   90.21
breastAvg    66.67  68.37   66.53   67.97   85.68
lung1        96.13  96.37   97.55   96.64   99.45
lung2        97.79  96.61   97.95   97.06   98.97
lung3        98.90  96.53   98.58   97.28   99.45
lungAvg      97.61  96.50   98.03   96.99   99.29
mll1         79.17  88.89   89.48   87.08   97.62
mll2         88.89  88.29   88.89   91.96   94.64
mll3         84.72  88.29   88.89   87.62   96.03
mllAvg       84.26  88.49   89.09   88.87   96.10
Average      82.04  86.05   85.56   85.49   94.85
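The complexity measure reported in Table 2 (the number of rules) equals the
number of leaves of the pruned tree, since each root-to-leaf path corresponds
to exactly one rule. Below is a minimal sketch of how such a count can be
obtained, assuming a scikit-learn CART tree with cost-complexity pruning as a
stand-in for the C4.5-style trees and pruning actually used in the experiments.

    # Rule count of a pruned decision tree: one rule per leaf.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def rule_count(tree_clf):
        # A node is a leaf when it has no left child (children_left == -1);
        # each leaf terminates exactly one root-to-leaf rule.
        # (Equivalent to tree_clf.get_n_leaves().)
        return int(np.sum(tree_clf.tree_.children_left == -1))

    # Example (hypothetical data X, y; ccp_alpha controls pruning strength):
    # clf = DecisionTreeClassifier(ccp_alpha=0.01).fit(X, y)
    # print("rules:", rule_count(clf))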
 