Note: Several of the "decision by a committee" methods for binary
classification have been extended to accommodate the multi-class case.
Surveys and reviews of these algorithms for the binary case are available in
[41, 45].
6.1 Multiple Binary Classifiers
The two common approaches to multi-class classification based on binary classifi-
ers are “one-against-all” and “one-against-one”.
6.1.1 One-Against-All Classification
Each class is compared to all the other classes combined, using a single
classifier or decision per class, i.e. for n classes there are n classifiers. Each of
the n classifiers can itself be built from a number of weak classifiers, as in the
AdaBoost algorithm [52].
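The scheme above can be sketched as follows. This is a minimal, hypothetical illustration: the binary model is a toy centroid-distance scorer standing in for a real binary learner (such as an AdaBoost ensemble of weak classifiers), and all names and data are invented for the example.

```python
# Hypothetical one-against-all sketch: for n classes, train n binary scorers,
# each separating one class from the pooled "rest". Prediction combines the
# n decisions by taking the maximum membership score, which assumes the
# scores are comparable across classifiers (see the caveat in the text).

def train_one_vs_all(X, y, classes):
    """Return one scoring function per class (higher score = more 'in-class')."""
    models = {}
    for c in classes:
        in_class = [x for x, label in zip(X, y) if label == c]
        centroid = [sum(col) / len(in_class) for col in zip(*in_class)]
        # Negative squared distance to the class centroid stands in for a
        # real binary classifier's membership value.
        models[c] = lambda x, m=centroid: -sum((a - b) ** 2 for a, b in zip(x, m))
    return models

def predict_one_vs_all(models, x):
    return max(models, key=lambda c: models[c](x))

X = [(0, 0), (0, 1), (5, 5), (5, 6), (9, 0), (9, 1)]
y = ["a", "a", "b", "b", "c", "c"]
models = train_one_vs_all(X, y, classes=["a", "b", "c"])
print(predict_one_vs_all(models, (5, 5.5)))  # → b
```

Note that the "rest" pool for each class here simply merges all other samples; as the text explains, balancing the size and content of that pool is the main practical difficulty.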
The main difficulty is the size and content of the training and testing sets for
the "all" group. This group has to fully characterize all the "other" classes and,
at the same time, remain of a manageable size. In many cases it must be of a
size similar to the group of samples that represents the examined class, in order
to avoid bias. In addition, combining the n decisions implies that the membership
values or classification outputs must be comparable across classifiers, which is
not always true. Contradictory decisions, or cases where no decision is reached,
may also occur, because the binary classifiers solve separate classification
problems [63]. Many of the methods based on binary classifiers further assume
that a single set of features can characterize all the borders or transitions
between the examined class and all the other classes.
6.1.2 One-Against-One (Pair-Wise) Classification
One-against-one classification refers to methods in which each class is compared
with each of the other classes, i.e. for n classes there are n(n-1)/2 classifiers or
comparisons [27, 54, 60]. The results of these binary classifiers are commonly
combined using ranking or voting schemes.
This approach can also be viewed as a graph in which each class has its own
connections to each of the other classes. The transitions between classes can
therefore be emphasized, rather than requiring a complete characterization of
each class or its center. This becomes especially important when the transitions
between classes are continuous and only a threshold distinguishes between them;
in this case, the direction of the transition can also be significant. The approach
allows each pair-wise classifier to be optimized in terms of both its feature set
and its classification algorithm. In addition, it is easier to construct balanced
datasets for training and testing.
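A minimal sketch of the pair-wise scheme with majority voting is given below. As in the previous sketch, the pairwise model is a hypothetical toy centroid comparison, and the data and names are invented for the example; any binary classifier could be substituted for each pair.

```python
from itertools import combinations

# Hypothetical one-against-one sketch: for n classes, train n*(n-1)/2 pairwise
# classifiers. Each casts a vote for one of its two classes, and the class with
# the most votes wins (one of the voting paradigms mentioned in the text).

def train_one_vs_one(X, y, classes):
    """Return a dict mapping each class pair to a binary decision function."""
    models = {}
    for c1, c2 in combinations(classes, 2):
        cen = {}
        for c in (c1, c2):
            pts = [x for x, label in zip(X, y) if label == c]
            cen[c] = [sum(col) / len(pts) for col in zip(*pts)]
        def decide(x, c1=c1, c2=c2, cen=cen):
            d = {c: sum((a - b) ** 2 for a, b in zip(x, cen[c])) for c in (c1, c2)}
            return c1 if d[c1] <= d[c2] else c2
        models[(c1, c2)] = decide
    return models

def predict_one_vs_one(models, x, classes):
    votes = {c: 0 for c in classes}
    for decide in models.values():
        votes[decide(x)] += 1
    # Tied votes correspond to the "no decision" cases discussed in the text;
    # here they are broken arbitrarily by dictionary order.
    return max(votes, key=votes.get)

classes = ["a", "b", "c"]
X = [(0, 0), (0, 1), (5, 5), (5, 6), (9, 0), (9, 1)]
y = ["a", "a", "b", "b", "c", "c"]
models = train_one_vs_one(X, y, classes)
print(len(models))                                     # 3 pairwise classifiers
print(predict_one_vs_one(models, (0, 0.4), classes))   # → a
```

Because each pairwise model sees only two classes, its feature set and training data can be tuned per pair, which is the optimization advantage described above.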
More classifiers are required in one-against-one classification than in
one-against-all classification, which can be a problem if the number of classes
is large. On the other hand, such classifiers can be simpler in terms of the
number of features and the construction of the training and testing data, and
they have the potential to be more accurate for each pair of classes.
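The trade-off in classifier counts can be made concrete with a small calculation; the values of n below are chosen arbitrarily for illustration.

```python
# One-against-all trains n classifiers; one-against-one trains n*(n-1)/2.
# The quadratic growth of the latter is the practical drawback noted above.
counts = {n: (n, n * (n - 1) // 2) for n in (3, 10, 50)}
for n, (ova, ovo) in counts.items():
    print(f"n={n}: one-against-all={ova}, one-against-one={ovo}")
# n=3:  3 vs 3;  n=10: 10 vs 45;  n=50: 50 vs 1225
```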