Algorithm 9 AdaBoost algorithm
Input: a set of examples $(x_1, y_1), \ldots, (x_N, y_N)$ where $x_i \in X$ and $y_i \in \{-1, +1\}$.
Let $m$ be the number of negative examples and $l$ be the number of positive examples.
Initialize the weights $w_{1,n} = \frac{1}{2m}$ if $y_n = -1$ and $w_{1,n} = \frac{1}{2l}$ if $y_n = +1$.
For $t = 1, \ldots, T$:
1. Normalize the weights $w_{t,n}$ so that $\sum_{n=1}^{N} w_{t,n} = 1$.
2. For each feature $f_j$, train a weak classifier $h_j$.
3. The error $\epsilon_j$ of a classifier $h_j$ is determined with respect to the weights $w_{t,1}, \ldots, w_{t,N}$:
$$\epsilon_j = \sum_{n=1}^{N} w_{t,n} \, \lvert h_j(x_n) - y_n \rvert$$
4. Choose the classifier $h_j$ with the lowest error $\epsilon_j$ and set $(h_t, \epsilon_t) = (h_j, \epsilon_j)$.
5. Update the weights $w_{t+1,n} = w_{t,n} \, \beta_t^{1-e_n}$, where $\beta_t = \frac{\epsilon_t}{1-\epsilon_t}$ and $e_n = 0$ if example $x_n$ is classified correctly by $h_t$ and $e_n = 1$ otherwise.
The final strong classifier is given by:
$$H(x) = \begin{cases} 1 & \text{if } \sum_{t=1}^{T} \log\left(\frac{1}{\beta_t}\right) h_t(x) \geq \frac{1}{2} \sum_{t=1}^{T} \log\left(\frac{1}{\beta_t}\right) \\ 0 & \text{otherwise.} \end{cases}$$
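To make the loop concrete, here is a minimal NumPy sketch of Algorithm 9 (not code from the text). Single-feature threshold stumps stand in for the per-feature weak classifiers $h_j$, and the function names (fit_stump, adaboost_train, adaboost_predict) are illustrative assumptions. With $\pm 1$ weak outputs, the text's final threshold $\frac{1}{2}\sum_t \log(1/\beta_t)$ reduces to taking the sign of the weighted vote, which is what the prediction function does.

```python
import numpy as np

def fit_stump(X, y, w, j):
    """Best threshold/polarity stump on feature j; returns (error, stump)."""
    best_err, best = np.inf, None
    for thr in np.unique(X[:, j]):
        for pol in (1, -1):
            pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
            err = np.sum(w[pred != y])        # step 3: weighted error of h_j
            if err < best_err:
                best_err, best = err, (j, thr, pol)
    return best_err, best

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_train(X, y, T):
    """Labels y must be in {-1, +1}. Returns a list of (log(1/beta_t), stump_t)."""
    n_neg, n_pos = np.sum(y == -1), np.sum(y == +1)
    # Initialization: w_{1,n} = 1/(2m) for negatives, 1/(2l) for positives.
    w = np.where(y == -1, 1.0 / (2 * n_neg), 1.0 / (2 * n_pos))
    ensemble = []
    for t in range(T):
        w = w / w.sum()                       # step 1: normalize the weights
        # Steps 2-4: one stump per feature, keep the one with lowest error.
        eps, h = min(fit_stump(X, y, w, j) for j in range(X.shape[1]))
        eps = max(eps, 1e-12)                 # guard against a perfect stump
        beta = eps / (1.0 - eps)
        # Step 5: multiply correctly classified weights by beta (e_n = 0).
        correct = stump_predict(h, X) == y
        w = np.where(correct, w * beta, w)
        ensemble.append((np.log(1.0 / beta), h))
    return ensemble

def adaboost_predict(ensemble, X):
    # Signed weighted vote; equivalent to the 1/2 * sum(log(1/beta_t))
    # threshold when the weak outputs are in {-1, +1}.
    votes = sum(alpha * stump_predict(h, X) for alpha, h in ensemble)
    return np.where(votes >= 0, 1, -1)
```

Restricting each round to one stump per feature mirrors the feature-selection role boosting plays in this formulation: each round commits to a single feature, so after $T$ rounds at most $T$ features are used.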
MultiBoost
We are given a set of training data $(x_1, c_1), \ldots, (x_n, c_n)$, where $x_i$ is the input and each output $c_i \in \{1, \ldots, K\}$ is a class label; $K$ denotes the number of possible class labels. Using these training data, MultiBoost, an extension of the original AdaBoost method, makes it possible to find a classification rule such that, given a new input $x$, we can assign it a class label $c$ from $\{1, \ldots, K\}$. Let $T(x)$ denote a weak multi-class classifier that assigns a class label to $x$. The MultiBoost algorithm, also called AdaBoostM1, then proceeds as reported in Algorithm 10.
Algorithm 10 AdaBoostM1 algorithm
Initialize the observation weights $w_i = \frac{1}{n}$, $i = 1, 2, \ldots, n$.
For $m = 1$ to $M$:
- Fit a classifier $T_m(x)$ to the training data using weights $w_i$.
- Compute
$$\mathrm{err}_m = \frac{\sum_{i=1}^{n} w_i \, \mathbb{I}(c_i \neq T_m(x_i))}{\sum_{i=1}^{n} w_i}.$$
- Compute
$$\alpha_m = \log\left(\frac{1 - \mathrm{err}_m}{\mathrm{err}_m}\right).$$
- Set $w_i \leftarrow w_i \cdot \exp\left(\alpha_m \cdot \mathbb{I}(c_i \neq T_m(x_i))\right)$, $i = 1, \ldots, n$.
- Re-normalize w i .
Output $C(x) = \arg\max_k \sum_{m=1}^{M} \alpha_m \, \mathbb{I}(T_m(x) = k)$.
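The per-round updates of Algorithm 10 translate almost line for line into code. Below is a minimal sketch, assuming scikit-learn depth-one decision trees as the weak multi-class learner $T_m(x)$; the function names are illustrative, and labels are taken as $0, \ldots, K-1$ here for array indexing, a small departure from the text's $1, \ldots, K$.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_m1_train(X, c, M):
    """c holds integer labels in {0, ..., K-1}. Returns a list of (alpha_m, T_m)."""
    n = len(c)
    w = np.full(n, 1.0 / n)                   # initialize w_i = 1/n
    ensemble = []
    for m in range(M):
        tree = DecisionTreeClassifier(max_depth=1)
        tree.fit(X, c, sample_weight=w)       # fit T_m using weights w_i
        miss = tree.predict(X) != c           # indicator I(c_i != T_m(x_i))
        err = np.sum(w * miss) / np.sum(w)    # err_m
        err = np.clip(err, 1e-12, 1 - 1e-12)  # numerical guard, not in the text
        alpha = np.log((1.0 - err) / err)     # alpha_m
        w = w * np.exp(alpha * miss)          # up-weight the misclassified points
        w = w / np.sum(w)                     # re-normalize w_i
        ensemble.append((alpha, tree))
    return ensemble

def adaboost_m1_predict(ensemble, X, K):
    # C(x) = argmax_k sum_m alpha_m * I(T_m(x) = k)
    scores = np.zeros((len(X), K))
    for alpha, tree in ensemble:
        scores[np.arange(len(X)), tree.predict(X)] += alpha
    return np.argmax(scores, axis=1)
```

Note that AdaBoostM1 only makes progress while $\mathrm{err}_m < 1/2$; for $K > 2$ this is a strong demand on the weak learner, which is the usual motivation for later multi-class boosting variants.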