MaxNCN algorithm can be written as follows:

For each instance Xi do
    neighbors_number[Xi] = 0
    neighbor = next_neighbor(Xi)
    While neighbor.class == Xi.class do
        neighbors_vector[Xi] = Id(neighbor)
        neighbors_number[Xi]++
        neighbor = next_neighbor(Xi)
    End while
End for
While Max_neighbors() > 0 do
    EliminateNeighbors(id_Max_neighbors)
End while
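The counting and elimination phases above can be sketched in Python. For brevity this sketch substitutes ordinary nearest neighbours for the nearest centroid neighbours that give MaxNCN its name, and the helper name `max_ncn` is illustrative, not the authors' code:

```python
import numpy as np

def max_ncn(X, y):
    """Simplified MaxNCN sketch (plain NN used instead of true NCN)."""
    n = len(X)
    # Pairwise Euclidean distances; column 0 of the sorted order is the
    # instance itself, so it is skipped.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    order = np.argsort(d, axis=1)[:, 1:]
    # Counting phase: walk outwards while neighbours share Xi's class.
    neighbors_vector = {i: [] for i in range(n)}
    for i in range(n):
        for j in order[i]:
            if y[j] != y[i]:
                break  # stop at the first enemy neighbour
            neighbors_vector[i].append(int(j))
    # Elimination phase: while some instance still has same-class
    # neighbours left, drop those neighbours as redundant; the
    # representative itself stays.
    keep = set(range(n))
    while True:
        counts = {i: sum(1 for j in neighbors_vector[i] if j in keep)
                  for i in keep}
        imax = max(counts, key=counts.get)
        if counts[imax] == 0:
            break
        keep -= set(neighbors_vector[imax])
    return sorted(keep)
```

On two well-separated clusters this reduces each class to a single representative.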
Reconsistent [112]—The Reconsistent algorithm is an enhanced version of the
Iterative MaxNCN. Once it has been applied to the set TR, the resulting subset is
processed by a condensing method (CNN), employing the original training set TR
as the reference set.
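For reference, the condensing step Reconsistent relies on is Hart's CNN rule; a minimal sketch follows, where the function name and the choice of seeding with the first instance are simplifications, not the authors' exact procedure:

```python
import numpy as np

def cnn(X, y):
    """Condensed NN sketch: grow a subset S until every training
    instance is correctly classified by its 1-NN in S."""
    S = [0]  # seed the subset with the first instance
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in S:
                continue
            d = np.linalg.norm(X[S] - X[i], axis=1)
            nn = S[int(np.argmin(d))]
            if y[nn] != y[i]:  # misclassified -> must join the subset
                S.append(i)
                changed = True
    return S
```

The result is a (not necessarily minimal) consistent subset: classifying with it reproduces the 1-NN labels of the full set.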
Template Reduction KNN (TRKNN) [53]—The TRKNN method introduces the
concept of nearest neighbor chains. Each chain is assigned to one instance and
is built by repeatedly finding the nearest neighbor of the last element, alternating
between the class of the starting instance and a different class. The chain stops
when an element is selected twice for the same chain. The distances between
consecutive elements of a chain form a non-increasing sequence, so the last
elements of the chain lie near the decision boundaries. TRKNN exploits this
property to drop all instances that are far away from the decision boundaries.
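A chain of this kind can be sketched as follows; `nn_chain` is an illustrative helper under the description above, not the TRKNN authors' implementation:

```python
import numpy as np

def nn_chain(X, y, start):
    """Build a TRKNN-style chain from `start`, alternating between the
    nearest neighbour in a different class and the nearest neighbour
    back in the starting class, until an element repeats."""
    chain = [start]
    cur, same_class = start, False  # the next link must be an enemy
    while True:
        mask = (y == y[start]) if same_class else (y != y[start])
        idx = np.where(mask)[0]
        d = np.linalg.norm(X[idx] - X[cur], axis=1)
        nxt = int(idx[np.argmin(d)])
        if nxt in chain:
            break  # element selected twice: the chain is complete
        chain.append(nxt)
        cur, same_class = nxt, not same_class
    return chain
```

In a toy one-dimensional set the successive link distances shrink, so the chain ends at instances next to the class boundary.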
8.4.2 Edition Algorithms
These algorithms edit out noisy instances as well as close border instances, leaving
smoother decision boundaries. They also retain all internal points, because an
internal instance is likely to be labeled with the same class as its neighbors.
Considering their search direction, they can be classified as:
8.4.2.1 Decremental
Edited Nearest Neighbor (ENN) [165]—Wilson developed this algorithm, which
starts with S = TR and then removes each instance of S that does not agree with
the majority of its k nearest neighbors.
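ENN can be sketched in a few lines, assuming a brute-force distance matrix; the function name and tie handling are illustrative:

```python
import numpy as np

def enn(X, y, k=3):
    """Wilson's ENN sketch: drop every instance whose class disagrees
    with the majority vote of its k nearest neighbours in TR
    (the instance itself excluded)."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # never count an instance as its own neighbour
    keep = []
    for i in range(n):
        nn = np.argsort(d[i])[:k]
        votes = np.bincount(y[nn])
        if np.argmax(votes) == y[i]:  # agrees with its neighbourhood
            keep.append(i)
    return keep
```

On data with one mislabeled point inside the opposite cluster, only that noisy instance is edited out, which is the smoothing effect described above.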
 