Table 8.2 Some of the most important prototype generation methods

Complete name                                                  Abbr. name   Reference
Prototype nearest neighbor                                     PNN          [27]
Generalized editing using nearest neighbor                     GENN         [99]
Learning vector quantization                                   LVQ          [98]
Chen algorithm                                                 Chen         [29]
Modified Chang's algorithm                                     MCA          [12]
Integrated concept prototype learner                           ICPL         [102]
Depuration algorithm                                           Depur        [140]
Hybrid LVQ3 algorithm                                          HYB          [90]
Reduction by space partitioning                                RSP          [141]
Evolutionary nearest prototype classifier                      ENPC         [54]
Adaptive condensing algorithm based on mixtures of Gaussians   MixtGauss    [113]
Self-generating prototypes                                     SGP          [52]
Adaptive Michigan PSO                                          AMPSO        [25]
Iterative prototype adjustment by differential evolution       IPADE        [151]
Differential evolution                                         DE           [152]
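As an illustration of the prototype generation idea behind several entries in Table 8.2, the sketch below follows an LVQ1-style update: prototypes are attracted toward same-class training samples and repelled from different-class ones. The number of prototypes per class, the learning rate, and the number of epochs are illustrative assumptions, not values prescribed by any of the referenced methods.

import numpy as np

def lvq1(X, y, n_prototypes_per_class=2, learning_rate=0.1, epochs=30, seed=0):
    """Minimal LVQ1-style prototype generation sketch (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    protos, labels = [], []
    for c in np.unique(y):
        # initialise prototypes from random training points of each class
        idx = rng.choice(np.where(y == c)[0], n_prototypes_per_class, replace=False)
        protos.append(X[idx].astype(float))
        labels.extend([c] * n_prototypes_per_class)
    P, L = np.vstack(protos), np.array(labels)

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            # find the closest prototype to the current training sample
            j = np.argmin(np.linalg.norm(P - X[i], axis=1))
            step = learning_rate * (X[i] - P[j])
            # attract if labels agree, repel otherwise (basic LVQ1 rule)
            P[j] += step if L[j] == y[i] else -step
    return P, L

The returned prototypes replace the original training set when classifying with a nearest-neighbor rule, which is what distinguishes prototype generation from prototype selection.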
Several distance metrics have been used with kNN and PS, especially when working with categorical attributes [166]. Some PS approaches learn not only the subset of selected prototypes, but also the distance metric employed [59, 128]. PS is also suitable for use with other types of dissimilarity-based classifiers [95, 131].
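Since PS is typically paired with kNN, a small sketch may make the metric point concrete. The overlap (Hamming) distance below is only one simple choice for purely categorical attributes (value difference metrics such as VDM/HVDM are also common), and the kNN vote can be applied to whatever prototype set a PS method returns; the function names and the value of k are illustrative assumptions.

import numpy as np
from collections import Counter

def overlap_distance(a, b):
    """Overlap (Hamming) distance for categorical vectors:
    the number of attributes on which the two instances disagree."""
    return sum(ai != bi for ai, bi in zip(a, b))

def knn_predict(query, prototypes, labels, k=3, metric=overlap_distance):
    """kNN vote over a (possibly PS-reduced) prototype set using the given metric."""
    dists = [metric(query, p) for p in prototypes]
    nearest = np.argsort(dists)[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]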
Table 8.3 enumerates the main advances in these topics proposed in the literature.
8.5.3 Hybridizations with Other Learning Methods and Ensembles
On the one hand, this family includes all the methods which simultaneously use instances and rules in order to compute the classification of a new object. If the values of the object fall within the range of a rule, its consequent predicts the class; otherwise, if no rule matches the object, the most similar rule or instance stored in the database is used to estimate the class, where similarity means the closest rule or instance according to a distance measure. In short, these methods can generalize an instance into a hyperrectangle or rule [50, 67, 114]; a minimal sketch of this rule-first, nearest-fallback scheme is given below.
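The following sketch illustrates the scheme under some simplifying assumptions: rules are represented as axis-aligned hyperrectangles with a class consequent, and Euclidean distance is used for the fallback step. It is not a reproduction of any specific method cited above.

import numpy as np

def classify_hybrid(x, rules, instances, instance_labels):
    """Rule-first, nearest-fallback classification sketch.

    `rules` is assumed to be a list of (lower, upper, cls) hyperrectangles,
    i.e. generalized instances; representation and distances are illustrative."""
    # 1) if the object falls inside a rule (hyperrectangle), use its consequent
    for lower, upper, cls in rules:
        if np.all(x >= lower) and np.all(x <= upper):
            return cls

    # 2) otherwise, fall back on the closest rule or stored instance
    def rect_distance(lower, upper):
        # distance from x to the hyperrectangle: 0 inside, per-axis overshoot outside
        return np.linalg.norm(np.maximum(lower - x, 0) + np.maximum(x - upper, 0))

    best_cls, best_d = None, np.inf
    for lower, upper, cls in rules:
        d = rect_distance(np.asarray(lower), np.asarray(upper))
        if d < best_d:
            best_cls, best_d = cls, d
    for inst, cls in zip(instances, instance_labels):
        d = np.linalg.norm(x - inst)
        if d < best_d:
            best_cls, best_d = cls, d
    return best_cls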
On the other hand, this area refers to ensemble learning, where an IS method is run several times and a classification decision is made according to the majority class obtained over the resulting subsets, optionally combined with a performance measure given by a learner [5, 71].
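A minimal sketch of such an ensemble, assuming an instance selection routine with the hypothetical interface is_method(X, y) -> mask (not a library call), could look as follows: the IS method is run on several bootstrap subsets, each reduced set classifies the query with 1-NN, and the majority class over all runs is returned.

import numpy as np
from collections import Counter

def ensemble_is_predict(x, X, y, is_method, n_runs=10, k=1, seed=0):
    """Ensemble sketch: run an IS method on several bootstrap subsets,
    classify with each reduced set, and take the majority class."""
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_runs):
        idx = rng.choice(len(X), size=len(X), replace=True)   # bootstrap subset
        Xs, ys = X[idx], y[idx]
        mask = is_method(Xs, ys)                               # assumed IS interface
        Xr, yr = Xs[mask], ys[mask]
        # k-NN decision using the reduced set of this run
        d = np.linalg.norm(Xr - x, axis=1)
        votes.append(Counter(yr[np.argsort(d)[:k]]).most_common(1)[0][0])
    return Counter(votes).most_common(1)[0][0]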
 
 