Table 5.4. Results for the evolutionary IS algorithms on PS (accuracy and reduction percentage per algorithm).

Database   GGA Acc.  GGA %   SGA Acc.  SGA %   CHC Acc.  CHC %   PBIL Acc.  PBIL %   Avg Acc.  Avg %
Clevel.    49.21     96.00   47.65     94.22   52.90     98.69   51.57      97.61    50.33     96.63
Glass      71.80     89.93   69.98     88.41   69.14     93.15   65.39      92.84    69.08     91.08
Iris       96.00     95.56   95.15     95.84   95.33     96.59   96.00      96.59    95.62     96.15
LED24D.    32.05     88.67   30.08     86.12   31.87     92.05   37.17      90.61    32.79     89.36
LED7D.     64.56     95.58   63.66     94.72   65.39     96.04   62.80      90.18    64.10     94.13
Lymph.     48.33     89.72   48.25     87.84   42.67     94.30   49.40      93.32    47.16     91.30
Monk       64.09     91.18   63.41     91.34   67.37     98.05   62.28      96.32    64.29     94.22
Pima       69.79     94.89   68.01     93.99   73.17     97.80   71.74      95.10    70.68     95.45
Wine       71.96     94.88   69.97     94.83   74.77     96.82   69.74      96.63    71.61     95.79
Wiscon.    97.07     98.93   96.12     96.43   95.91     99.40   96.05      98.32    96.29     98.27
Average    66.49     93.53   65.23     92.37   66.85     96.29   66.21      94.75    66.20     94.24
We wish to point out the following conclusions about the evolutionary IS
algorithms for PS:
- We should highlight the high values reached by the EAs for the reduction percentage (around 94%). This is a great advantage with regard to the percentage of the best classical algorithm, DROP3 (73.26%).
- The average behavior of all the EAs is 66.20 for accuracy and 94.24 for reduction percentage. These values are better than those of all the classical algorithms.
- Although the results of all the EAs are, in general, very similar, the CHC algorithm emerges as the best one.
5.7.2 Analysis and Results for Training Set Selection
As we mentioned in Section 5.6.2, for each data set and IS algorithm executed, two pairs of selected training set and test set are obtained, (S_i, s_i), i = 1, 2. These pairs are then used by two learning algorithms, the 1-NN classifier and C4.5. Tables 5.5 and 5.7 show the results of these algorithms on the pairs (S_i, s_i) obtained from the classical IS algorithms. Tables 5.6 and 5.8 contain the same information for the pairs (S_i, s_i) obtained from the evolutionary IS algorithms:
- The column "1-NN" gives the test accuracy obtained by classifying the elements of s_i with a 1-NN classifier that uses the set of prototypes S_i.
- The column "C4.5" contains the test accuracy achieved by classifying the elements of s_i by means of the decision tree learned by C4.5 from S_i.
- The column "%" reports the reduction percentage of S_i with regard to the corresponding complete data set.
Again, for the EAs, three columns with the average results over all of these algorithms are included.
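As an illustrative sketch of this evaluation protocol (not the chapter's actual experiments), the snippet below scores one hypothetical pair (S_i, s_i): Iris stands in for a data set, "keep every third training instance" stands in for an IS algorithm, and scikit-learn's DecisionTreeClassifier (CART) is only an approximation of C4.5.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Assumed stand-in data set: Iris, split into a training set and a test set s_i.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Hypothetical "IS algorithm": keep every third training instance as S_i.
mask = np.zeros(len(X_train), dtype=bool)
mask[::3] = True
S_X, S_y = X_train[mask], y_train[mask]

# "1-NN" column: classify the elements of s_i using the prototypes S_i.
acc_1nn = KNeighborsClassifier(n_neighbors=1).fit(S_X, S_y).score(X_test, y_test)

# "C4.5" column (approximated here by CART): tree learned from S_i, tested on s_i.
acc_tree = DecisionTreeClassifier(random_state=0).fit(S_X, S_y).score(X_test, y_test)

# "%" column: reduction percentage of S_i with regard to the full training set.
reduction = 100.0 * (1.0 - len(S_X) / len(X_train))

print(f"1-NN: {100 * acc_1nn:.2f}  C4.5~: {100 * acc_tree:.2f}  %: {reduction:.2f}")
```

A real run of the protocol would replace the mask with the subset chosen by GGA, SGA, CHC, or PBIL and average the two (S_i, s_i) partitions, as in Tables 5.5-5.8.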
 