5.4.1.1 Decision Combination in Multiple Classifier Systems
As previously mentioned, parallel approaches require a posterior combination phase once all the classifiers have evaluated a given example. Many decision combination proposals can be found in the literature, such as the intersection of decision regions [29], voting methods [62], prediction by top choice combinations [91], the use of the Dempster-Shafer theory [58, 97] or ranking methods [36]. In particular, we will study the following four combination methods for MCSs built with heterogeneous classifiers (minimal illustrative sketches of each rule are given after the list):
1. Majority vote (MAJ) [62]: A simple but powerful approach in which each classifier casts a vote for its predicted class, and the class with the most votes is chosen as the output.
2. Weighted majority vote (W-MAJ) [80]: As in MAJ, each classifier casts a vote for its predicted class, but here the vote is weighted according to the competence (accuracy) of the classifier, as measured in the training phase.
3. Naïve Bayes [87]: This method assumes that the base classifiers are mutually independent, so the predicted class is the one that obtains the highest posterior probability. These probabilities are computed from the confusion matrix of each classifier.
4. Behavior-Knowledge Space (BKS) [38]: A multinomial method that indexes a cell in a look-up table for each possible combination of classifier outputs. A cell is labeled with the class to which the majority of the training instances in that cell belong. A new instance receives the label of its corresponding cell; if the cell is unlabeled or there is a tie, the output is given by MAJ.
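To make the two voting rules concrete, the following minimal Python sketch implements MAJ and W-MAJ for a single test instance (the function names and toy labels are ours, not part of the original proposals; ties are broken arbitrarily here, by insertion order):

from collections import Counter

def majority_vote(predictions):
    # MAJ: each base classifier contributes one vote; the most voted
    # class label wins.
    return Counter(predictions).most_common(1)[0][0]

def weighted_majority_vote(predictions, weights):
    # W-MAJ: each vote is weighted, e.g. by the classifier's training
    # accuracy; the label with the largest accumulated weight wins.
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# Three base classifiers vote on one instance.
print(majority_vote(["a", "b", "a"]))                            # -> a
print(weighted_majority_vote(["a", "b", "b"], [0.9, 0.6, 0.7]))  # -> b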
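The Naïve Bayes combiner can be sketched in the same spirit. Under the independence assumption, the posterior of class c given the outputs s_1, ..., s_L is proportional to P(c) * prod_k P(s_k | c), where each conditional is estimated from the row-normalized confusion matrix of classifier k. The toy matrices below are made up; in practice, zero counts are usually smoothed (e.g. with a Laplace correction) to avoid annulling the product:

import numpy as np

def naive_bayes_combine(outputs, confusion_matrices, class_priors):
    # outputs[k] is the class index predicted by classifier k;
    # confusion_matrices[k][c, s] counts training instances of true
    # class c that classifier k labeled as s.
    posterior = np.array(class_priors, dtype=float)
    for k, s_k in enumerate(outputs):
        cm = confusion_matrices[k]
        # P(s_k | c): fraction of class-c instances labeled s_k by classifier k.
        posterior *= cm[:, s_k] / cm.sum(axis=1)
    return int(np.argmax(posterior))

# Two classes, two classifiers disagreeing on one instance.
cms = [np.array([[40, 10], [5, 45]]), np.array([[35, 15], [10, 40]])]
print(naive_bayes_combine([0, 1], cms, class_priors=[0.5, 0.5]))  # -> 0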
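Finally, a possible sketch of BKS, again with hypothetical names: the look-up table is built from the training outputs, and prediction falls back to MAJ for empty or tied cells, as described above:

from collections import Counter, defaultdict

def build_bks_table(train_outputs, train_labels):
    # One cell per distinct tuple of base-classifier outputs, labeled
    # with the majority true class of the instances falling in it
    # (None when two classes tie for the majority).
    cells = defaultdict(Counter)
    for outputs, y in zip(train_outputs, train_labels):
        cells[tuple(outputs)][y] += 1
    table = {}
    for key, counts in cells.items():
        ranked = counts.most_common()
        tie = len(ranked) > 1 and ranked[0][1] == ranked[1][1]
        table[key] = None if tie else ranked[0][0]
    return table

def bks_predict(outputs, table):
    # Use the cell label; fall back to MAJ over the classifier outputs
    # when the cell is missing or unlabeled.
    label = table.get(tuple(outputs))
    return label if label is not None else Counter(outputs).most_common(1)[0][0]

table = build_bks_table([("a", "a"), ("a", "b"), ("a", "b")], ["a", "b", "b"])
print(bks_predict(("a", "b"), table))  # labeled cell -> b
print(bks_predict(("b", "b"), table))  # unseen cell, MAJ fallback -> b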
We always use the same training data set to train all the base classifiers and to compute the parameters of the aggregation methods, as recommended in [53]. Using a separate set of examples to obtain those parameters can leave important training data out of the learning process, which generally translates into a loss of accuracy in the final MCS.
In MCSs built with heterogeneous classifiers, not all of the base classifiers may return a confidence value. Even though each classifier can be individually modified to return a confidence for its predictions, such confidences would come from different computations depending on the classifier adapted, and their combination could become meaningless. In MCSs built with the same type of classifier, however, this problem does not arise, and the confidences can be combined because they are homogeneous across all the base classifiers [53]. Therefore, in the case of bagging, where the same learning algorithm is used to train all the base classifiers, the confidence of each prediction can be used to compute a weight, and these weights can in turn be used in a weighted voting combination scheme, as sketched below.
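A minimal sketch of such confidence-weighted voting for bagging, assuming each of the homogeneous base models exposes a per-class probability vector (the probabilities below are made up for illustration):

import numpy as np

def confidence_weighted_vote(probas):
    # probas has shape (n_classifiers, n_classes); each model votes for
    # its most probable class, and the vote is weighted by the
    # probability (confidence) it assigns to that class.
    scores = np.zeros(probas.shape[1])
    for p in probas:
        pred = int(np.argmax(p))
        scores[pred] += p[pred]
    return int(np.argmax(scores))

# Three bagged models on one instance: two moderately confident votes
# for class 1 outweigh one strongly confident vote for class 0.
probas = np.array([[0.9, 0.1], [0.4, 0.6], [0.45, 0.55]])
print(confidence_weighted_vote(probas))  # -> 1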
 