All the EAs use the solution representation and fitness function presented in
Sections 5.5.1 and 5.5.2. All the algorithms assume k = 1 (i.e., the 1-NN rule).
5.6.3.2 Instance Selection - Training Set Selection
For IS-TSS, we have run two classical IS algorithms: MCS and DROP1. MCS was
chosen because it achieves the best classification accuracy on test data, and
DROP1 was selected because it achieves the highest reduction of the training set.
The parameters used for EAs are:
• The population size of GGA is 20 chromosomes. The crossover rate is 1, and
two mutation rates were considered: 0.01 for changing 1 to 0, and 0.001 for
changing 0 to 1. GGA was run for 500 generations.
• The parameters of SGA are the same, but considering 10,000 offspring
evaluations.
• The population size of the CHC algorithm was 20 chromosomes, and it was
executed for 500 generations.
• The parameters for PBIL were: number of samples N = 20, LR = 0.005, Pm = 0.01, and
Mut_Shift = 0.01. This algorithm completed 500 iterations.
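To make the PBIL parameters above concrete, the following is a minimal sketch of one PBIL iteration over a binary instance-selection chromosome: sample chromosomes from a probability vector, then pull the vector toward the best sample with learning rate LR and apply probabilistic mutation shifts governed by Pm and Mut_Shift. Function names and the update form are illustrative assumptions, not the book's exact pseudocode.

```python
import random

def sample_chromosome(p):
    """Draw one binary chromosome from the probability vector p."""
    return [1 if random.random() < pi else 0 for pi in p]

def pbil_update(p, best, lr=0.005, pm=0.01, mut_shift=0.01):
    """One PBIL update step (illustrative sketch):
    - move each probability toward the best sampled chromosome (LR),
    - with probability pm, shift it toward a random 0/1 by mut_shift."""
    for i in range(len(p)):
        # learning: pull p[i] toward the best chromosome's bit
        p[i] = p[i] * (1.0 - lr) + best[i] * lr
        # mutation: occasionally shift toward a random direction
        if random.random() < pm:
            direction = random.randint(0, 1)
            p[i] = p[i] * (1.0 - mut_shift) + direction * mut_shift
    return p
```

In a full run, N = 20 chromosomes would be sampled per iteration, the fittest one used as `best`, and the loop repeated for the 500 iterations stated above.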
For IS-TSS, we have increased the population size due to the greater
complexity of the search space.
The solution representation and fitness function used by all the EAs are those
presented in Sections 5.5.1 and 5.5.2, respectively. All the algorithms use the
1-NN rule in the fitness function.
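A common fitness formulation for EA-based instance selection weights 1-NN classification accuracy against training-set reduction; the book's exact definition is in Section 5.5.2, so the weight `alpha` and the accuracy estimate below are assumptions for illustration only.

```python
import math

def one_nn_predict(x, subset):
    """Classify x by its nearest (squared-Euclidean) neighbour in subset."""
    best_d, best_label = math.inf, None
    for features, label in subset:
        d = sum((a - b) ** 2 for a, b in zip(x, features))
        if d < best_d:
            best_d, best_label = d, label
    return best_label

def fitness(mask, training, alpha=0.5):
    """Illustrative fitness: alpha * 1-NN accuracy + (1 - alpha) * reduction.
    Note: this sketch lets an instance match itself as neighbour; the book's
    formulation (Section 5.5.2) may instead use leave-one-out accuracy."""
    subset = [inst for bit, inst in zip(mask, training) if bit == 1]
    if not subset:
        return 0.0  # an empty selection classifies nothing
    correct = sum(one_nn_predict(x, subset) == y for x, y in training)
    clas_rate = correct / len(training)
    perc_red = 1.0 - len(subset) / len(training)
    return alpha * clas_rate + (1 - alpha) * perc_red
```

Each EA chromosome is the binary `mask`, so a higher fitness rewards subsets that keep accuracy while discarding instances.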
5.7 Analysis of the Experiments
5.7.1 Analysis and Results for Prototype Selection
Tables 5.3 and 5.4 show the results obtained by the classical algorithms and the
evolutionary IS algorithms, respectively:
• The average test accuracy over the 10 trials is reported for each algorithm on
each data set.
• The average reduction percentage from the initial training sets is also reported
for each experiment under the column “%.”
Furthermore, to observe the level of robustness achieved by all the EAs, we
have included in Table 5.4 two columns with the average results of all of them.