prominent member. They work by assigning a weight to each example. Initially, all the examples have the same weight. In each iteration a base (also called weak) classifier is constructed according to the distribution of weights. Afterwards, the weight of each example is readjusted, based on whether the base classifier assigned it the correct class. The final result is obtained by a weighted vote of the base classifiers.
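As an illustration of this loop, here is a minimal two-class AdaBoost sketch in Python/NumPy; the decision-stump base learner, the particular $\alpha$ rule, and all names are assumptions made for the example, since the paragraph above does not fix a base learner.

import numpy as np

def train_stump(X, y, w):
    # Base learner: exhaustively pick the feature, threshold, and sign
    # with the lowest weighted error under the current weights w.
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def stump_predict(X, stump):
    _, j, thr, sign = stump
    return np.where(X[:, j] <= thr, sign, -sign)

def adaboost(X, y, T=10):
    # y must take values in {-1, +1}
    m = len(y)
    w = np.full(m, 1.0 / m)                    # initially all weights equal
    ensemble = []
    for _ in range(T):
        stump = train_stump(X, y, w)           # base classifier built from weights
        err = max(stump[0], 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this base classifier
        pred = stump_predict(X, stump)
        w *= np.exp(-alpha * y * pred)         # readjust example weights
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(X, ensemble):
    # Final result: weighted vote of the base classifiers.
    return np.sign(sum(a * stump_predict(X, s) for a, s in ensemble))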
AdaBoost is only for binary problems, but there are several methods of extending AdaBoost to the multiclass case. Figure 1 shows AdaBoost.MH¹ [Schapire and Singer (1998)]. Each instance $x_i$ belongs to a domain $X$ and has an associated label $y_i$, which belongs to a finite label space $Y$.

AdaBoost.MH associates a weight to each combination of examples and labels. The base learner generates a base hypothesis $h_t$, according to the weights. A real value, $\alpha_t$, the weight of the base classifier, is selected. Then, the weights are readjusted. For $y, l \in Y$, $y[l]$ is defined as

$$y[l] = \begin{cases} +1 & \text{if } l = y, \\ -1 & \text{if } l \neq y. \end{cases}$$
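In code, $y[l]$ is just a signed indicator over the label space; a minimal sketch, where the label values are hypothetical:

import numpy as np

def y_bracket(y, labels):
    # y[l] = +1 for the true label of the example, -1 for every other label.
    return np.where(labels == y, 1, -1)

labels = np.array([0, 1, 2])   # the finite label space Y
print(y_bracket(1, labels))    # [-1  1 -1]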
Two open questions are how to select $\alpha_t$ and how to train the weak learner. The first question is addressed in [Schapire and Singer (1998)]. For two-class problems, if the base classifier returns a value in $\{-1, +1\}$, then $\alpha_t = \frac{1}{2} \ln\left(\frac{1 - \epsilon_t}{\epsilon_t}\right)$, where $\epsilon_t$ is the weighted error of the base classifier under $D_t$.
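For example, with a hypothetical weighted error of $\epsilon_t = 0.2$, the base classifier receives weight $\alpha_t = \frac{1}{2} \ln \frac{0.8}{0.2} = \frac{1}{2} \ln 4 \approx 0.69$; as $\epsilon_t$ approaches $1/2$ (random guessing), $\alpha_t$ approaches 0, so more accurate base classifiers get larger votes.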
Given $(x_1, y_1), \ldots, (x_m, y_m)$ where $x_i \in X$, $y_i \in Y$
Initialize $D_1(i, l) = 1/(mk)$, where $k = |Y|$
For $t = 1, \ldots, T$:
    Train weak learner using distribution $D_t$
    Get weak hypothesis $h_t: X \times Y \rightarrow \mathbb{R}$
    Choose $\alpha_t \in \mathbb{R}$
    Update
        $D_{t+1}(i, l) = D_t(i, l) \exp(-\alpha_t\, y_i[l]\, h_t(x_i, l)) / Z_t$
    where $Z_t$ is a normalization factor (chosen so that $D_{t+1}$ will be a distribution)
Output the final hypothesis: $f(x, l) = \sum_{t=1}^{T} \alpha_t h_t(x, l)$

Fig. 1. AdaBoost.MH [Schapire and Singer (1998)].
¹ Although AdaBoost.MH also considers multilabel problems (an example can be simultaneously of several classes), the version presented here does not include this case.
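To make Fig. 1 concrete, the following Python/NumPy sketch follows its steps literally; the threshold-stump weak learner and the choice $\alpha_t = \frac{1}{2} \ln \frac{1 + r_t}{1 - r_t}$ (with $r_t = \sum_{i,l} D_t(i,l)\, y_i[l]\, h_t(x_i, l)$, one of the options analyzed in [Schapire and Singer (1998)]) are assumptions layered on top of the figure, and all names are illustrative.

import numpy as np

def train_stump(X, Yb, D):
    # Weak learner: one feature, one threshold, and a per-label sign,
    # chosen to maximize the weighted correlation with the targets y_i[l].
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            side = np.where(X[:, j] <= thr, 1.0, -1.0)       # shape (m,)
            corr = (D * Yb * side[:, None]).sum(axis=0)      # shape (k,)
            s = np.where(corr >= 0, 1.0, -1.0)               # per-label sign
            score = np.abs(corr).sum()
            if best is None or score > best[1]:
                best = ((j, thr, s), score)
    (j, thr, s), _ = best
    return lambda Xq: np.where(Xq[:, [j]] <= thr, 1.0, -1.0) * s

def adaboost_mh(X, y, labels, T=10):
    m, k = len(y), len(labels)
    Yb = np.where(labels == y[:, None], 1.0, -1.0)   # the matrix y_i[l]
    D = np.full((m, k), 1.0 / (m * k))               # D_1(i, l) = 1/(mk)
    ensemble = []
    for _ in range(T):
        h = train_stump(X, Yb, D)                    # train on distribution D_t
        H = h(X)                                     # h_t(x_i, l), shape (m, k)
        r = np.clip((D * Yb * H).sum(), -1 + 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 + r) / (1 - r))      # alpha_t
        D *= np.exp(-alpha * Yb * H)                 # reweight example-label pairs
        D /= D.sum()                                 # divide by Z_t
        ensemble.append((alpha, h))
    return ensemble

def predict(X, ensemble, labels):
    # f(x, l) = sum_t alpha_t h_t(x, l); predict the label maximizing f.
    f = sum(a * h(X) for a, h in ensemble)
    return np.asarray(labels)[np.argmax(f, axis=1)]

Note that everything except the stump and the $\alpha_t$ rule is dictated by the figure: the $1/(mk)$ initialization, the exponential update over example-label pairs, and the normalization by $Z_t$.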
 