Fig. 9.7 Model-guided instance selection diagram. [Diagram: the training set is fed through dataset manipulators to produce Dataset 1 through Dataset T; Inducer 1 through Inducer T build Classifier 1 through Classifier T, and a classifiers composer combines the classifiers' predicted labels for unlabeled tuples.]
order to achieve a higher accuracy than the weak classifiers would have
achieved on their own.
Freund and Schapire (1996) introduced the AdaBoost algorithm. The
main idea of this algorithm is to assign a weight to each example in the
training set. Initially, all weights are equal, but in every round the
weights of misclassified instances are increased while the weights of
correctly classified instances are decreased. As a consequence, the weak
learner is forced to focus on the difficult instances of the training set.
This procedure yields a series of classifiers that complement one another.
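The weight-update loop described above can be sketched as follows. This is an illustrative implementation, not the book's pseudo-code from Figure 9.8: the threshold-stump weak learner, the function names, and the toy data format are all assumptions made here for concreteness.

```python
import math

def stump_learn(xs, ys, w):
    """Hypothetical weak learner: pick the 1-D threshold stump with the
    lowest weighted error (the book leaves the weak learner unspecified)."""
    best = None
    for thr in xs:
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if (pol if xi >= thr else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best  # (weighted error, threshold, polarity)

def adaboost(xs, ys, T):
    """Run T boosting rounds; return a list of (alpha, threshold, polarity)."""
    m = len(xs)
    w = [1.0 / m] * m          # all weights start equal
    models = []
    for _ in range(T):
        err, thr, pol = stump_learn(xs, ys, w)
        err = max(err, 1e-10)  # guard against a zero-error stump
        alpha = 0.5 * math.log((1 - err) / err)
        models.append((alpha, thr, pol))
        # increase weights of misclassified instances, decrease the rest
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            pred = pol if xi >= thr else -pol
            w[i] *= math.exp(-alpha * yi * pred)
        s = sum(w)
        w = [wi / s for wi in w]  # renormalise so the weights sum to 1
    return models
```

On a toy set such as `xs = [1, 2, 3, 4]`, `ys = [-1, -1, 1, 1]`, each round re-fits a stump on the re-weighted data, so later rounds concentrate on whatever the earlier stumps got wrong.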
The pseudo-code of the AdaBoost algorithm is described in Figure 9.8.
The algorithm assumes that the training set consists of m instances, labeled
as −1 or +1. The classification of a new instance is made by voting on all
classifiers {M_t}, each having a weight of α_t. Mathematically, it can be
written as:

H(x) = sign( ∑_{t=1}^{T} α_t · M_t(x) ).   (9.16)
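The weighted vote of Equation (9.16) is a one-liner. In this sketch the classifiers are represented as plain functions returning −1 or +1; the pairing of each classifier with its weight α_t is an assumed data layout, not notation from the book.

```python
def weighted_vote(classifiers, x):
    """Eq. (9.16): H(x) = sign(sum over t of alpha_t * M_t(x)).
    `classifiers` is a list of (alpha_t, M_t) pairs, where each M_t
    maps an instance to -1 or +1 (illustrative representation)."""
    s = sum(alpha * m(x) for alpha, m in classifiers)
    return 1 if s >= 0 else -1

# Example: one strong classifier (alpha = 0.9) can outvote two weaker
# ones (alpha = 0.3 each) that always predict -1.
ensemble = [
    (0.9, lambda x: 1 if x > 0 else -1),
    (0.3, lambda x: -1),
    (0.3, lambda x: -1),
]
```

Note that the vote is weighted, not a simple majority: here two of the three classifiers say −1 for a positive x, yet the ensemble predicts +1 because 0.9 > 0.3 + 0.3.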