Tomek Condensed Nearest Neighbor (TCNN) [150]—Tomek presents two modifications of the CNN algorithm. Method 1 proceeds like CNN but, when an instance X i is misclassified (because its nearest neighbor in S, s, belongs to the opposite class), instead of adding X i to S the method finds the nearest neighbor of s that belongs to the same class as X i, and adds that neighbor to S.
Method 2 is also a modification of CNN in which, instead of using the instances of TR to build S, only a subset F of TR is employed. F is composed of those instances of TR whose class coincides with that of their nearest neighbors.
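Tomek's first modification can be sketched in a few lines. This is a minimal illustration, not a reference implementation: the function name, the seeding of S with the first training instance (CNN's usual convention), the Euclidean metric, and the toy dataset in the test are all assumptions not fixed by the text.

```python
# Sketch of TCNN method 1: when x is misclassified by its nearest neighbor
# s in S, add to S the nearest neighbor of s that shares x's class
# (taken from TR), instead of adding x itself.
from math import dist

def nearest(point, pool):
    """Return the (features, label) pair in `pool` closest to `point`."""
    return min(pool, key=lambda q: dist(point[0], q[0]))

def tcnn_method1(TR):
    """TR: list of (features, label) pairs. Returns the condensed set S."""
    S = [TR[0]]                    # assumption: seed S with the first instance
    changed = True
    while changed:
        changed = False
        for x in TR:
            if x in S:
                continue
            s = nearest(x, S)
            if s[1] != x[1]:       # x is misclassified by its NN in S
                # candidates: same class as x, not yet selected
                same_class = [q for q in TR if q[1] == x[1] and q not in S]
                if same_class:
                    S.append(nearest(s, same_class))
                    changed = True
    return S
```

Method 2 would simply call the same procedure on the filtered subset F instead of the full TR.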
Modified Condensed Nearest Neighbor (MCNN) [46]—This algorithm is similar to CNN but, instead of adding an instance to the set S when it is misclassified, it flags all the misclassified instances and, once every instance in TR has been tested, adds to S one representative example per class, generated as the centroid of the misclassified examples of that class. The process is repeated iteratively until no instance in TR is misclassified.
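The flag-then-add-centroids loop can be sketched as follows. The seeding of S with one centroid per class over the whole training set, the coordinate-wise centroid, and the test data are assumptions made for the sake of a self-contained example.

```python
# Sketch of MCNN: each pass only flags misclassified instances; afterwards,
# the centroid of each class's flagged examples is added to S; repeat until
# no instance in TR is misclassified.
from math import dist

def nearest_label(point, S):
    """Label of the prototype in S nearest to `point`."""
    return min(S, key=lambda q: dist(point, q[0]))[1]

def centroid(points):
    """Coordinate-wise mean of a list of feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def mcnn(TR):
    """TR: list of (features, label) pairs. Returns the prototype set S."""
    labels = sorted({y for _, y in TR})
    # assumption: initialize S with one per-class centroid of the full data
    S = [(centroid([x for x, y in TR if y == c]), c) for c in labels]
    while True:
        flagged = [(x, y) for x, y in TR if nearest_label(x, S) != y]
        if not flagged:
            return S
        for c in {y for _, y in flagged}:
            pts = [x for x, y in flagged if y == c]
            S.append((centroid(pts), c))
```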
Generalized Condensed Nearest Neighbor (GCNN) [28]—The GCNN algorithm tries to improve the CNN algorithm. Firstly, the initial prototypes are selected as the most voted instances of each class (an instance receives a vote whenever it is the nearest same-class neighbor of another instance). Then the CNN rule is applied, but a new instance X is considered correctly classified only if its nearest neighbor X i in S belongs to its same class, and the distance between X and X i is lower than dist, where dist is the distance between X and its nearest enemy (nearest neighbor of the opposite class) in S.
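The strengthened absorption criterion just described can be isolated in a small predicate. The function name, the Euclidean metric, and the behavior when S contains no enemy of the instance's class are assumptions for illustration.

```python
# Sketch of GCNN's absorption test: an instance counts as correctly
# classified only if its nearest same-class prototype in S lies strictly
# closer than its nearest enemy in S.
from math import dist

def gcnn_absorbed(x, y, S):
    """Return True if instance (x, y) satisfies the GCNN criterion
    against the prototype set S (a list of (features, label) pairs)."""
    friends = [p for p, c in S if c == y]
    enemies = [p for p, c in S if c != y]
    if not friends:
        return False
    if not enemies:
        return True               # no enemy in S: plain CNN absorption
    d_friend = min(dist(x, p) for p in friends)
    d_enemy = min(dist(x, p) for p in enemies)
    return d_friend < d_enemy
```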
Fast Condensed Nearest Neighbor family (FCNN) [7]—The FCNN1 algorithm starts by introducing into S the centroid of each class. Then, for each prototype p in S, its nearest enemy inside its Voronoi region is found and added to S. This process is performed iteratively until no enemies are found in an entire iteration.
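The FCNN1 iteration can be sketched as below; FCNN2 through FCNN4 differ only in what is added per pass, as the following entries describe. The function name, the Euclidean metric, and the toy dataset in the test are assumptions.

```python
# Sketch of FCNN1: seed S with the per-class centroids; on each pass,
# assign every training instance to its nearest prototype (its Voronoi
# region) and, for each prototype, add the nearest enemy in its region.
from math import dist

def fcnn1(TR):
    """TR: list of (features, label) pairs. Returns the prototype set S."""
    def centroid(pts):
        n = len(pts)
        return tuple(sum(p[i] for p in pts) / n for i in range(len(pts[0])))
    labels = sorted({y for _, y in TR})
    S = [(centroid([x for x, y in TR if y == c]), c) for c in labels]
    while True:
        additions = []
        for p, c in S:
            # instances whose nearest prototype is p form p's Voronoi region
            region = [(x, y) for x, y in TR
                      if min(S, key=lambda q: dist(x, q[0]))[0] == p]
            enemies = [(x, y) for x, y in region if y != c]
            if enemies:
                additions.append(min(enemies, key=lambda e: dist(e[0], p)))
        if not additions:
            return S
        S.extend(a for a in additions if a not in S)
```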
Fast Condensed Nearest Neighbor 2 (FCNN2): The FCNN2 algorithm is similar to FCNN1 but, instead of adding the nearest enemy in each Voronoi region, it adds the centroid of the enemies found in that region.
Fast Condensed Nearest Neighbor 3 (FCNN3): The FCNN3 algorithm is similar to FCNN1 but, instead of adding one prototype per region in each iteration, only one prototype is added (the one belonging to the Voronoi region with the most enemies). In FCNN3, S is initialized with only the centroid of the most populated class.
Fast Condensed Nearest Neighbor 4 (FCNN4): The FCNN4 algorithm is similar to FCNN2 but, instead of adding one centroid per region in each iteration, only one centroid is added (the one belonging to the Voronoi region with the most enemies). In FCNN4, S is initialized with only the centroid of the most populated class.
Prototype Selection based on Clustering (PSC) [126]—To build the set S, PSC first employs the C-Means algorithm to extract clusters from the set TR of training prototypes. Then, for each cluster G: if it is homogeneous (all its prototypes belong to the same class), its centroid is added to S; if it is not homogeneous, its majority class G m is computed, and every instance in the cluster that does not belong to G m is added to S, along with its nearest neighbor of class G m.
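The per-cluster rule of PSC can be sketched as follows. The C-Means clustering step itself is omitted; the function name, the coordinate-wise centroid, majority-vote tie-breaking via `Counter.most_common`, and the hand-made clusters in the test are assumptions.

```python
# Sketch of the PSC rule for a single cluster G: a homogeneous cluster
# contributes only its centroid; a mixed cluster contributes every
# minority-class instance together with its nearest majority-class neighbor.
from collections import Counter
from math import dist

def psc_cluster_prototypes(G):
    """G: list of (features, label) pairs in one cluster. Returns prototypes."""
    labels = [y for _, y in G]
    if len(set(labels)) == 1:      # homogeneous cluster: keep its centroid
        n = len(G)
        c = tuple(sum(x[i] for x, _ in G) / n for i in range(len(G[0][0])))
        return [(c, labels[0])]
    majority = Counter(labels).most_common(1)[0][0]
    major = [(x, y) for x, y in G if y == majority]
    protos = []
    for x, y in G:
        if y != majority:
            protos.append((x, y))                              # minority instance
            protos.append(min(major, key=lambda m: dist(m[0], x)))  # its NN in G m
    return protos
```

The full S would be the union of these per-cluster prototype lists over all clusters produced by C-Means.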
 