Table 2. AdaBoost Algorithm

[Initialization] $w_{1,i} = \frac{1}{2m}, \frac{1}{2l}$ for $c_i = -1, 1$, respectively.

Step 1. Normalize the weights: $w_{t,i} \leftarrow \frac{w_{t,i}}{\sum_{j=1}^{N} w_{t,j}}$.

Step 2. For each dimension $j$, train a classifier $h_j$ which is restricted to using that single random variable. The error is evaluated with respect to $w_t$: $\epsilon_j = \sum_i w_i \, |h_j(x_i) - c_i|$.

Step 3. Choose the classifier $h_t$ with the lowest error $\epsilon_t$.

Step 4. Update the weights: $w_{t+1,i} = w_{t,i} \, \beta_t^{e_i}$, where $e_i = 1$ for each well-classified data point and $e_i = 0$ otherwise, and $\beta_t = \frac{\epsilon_t}{1 - \epsilon_t}$.

Step 5. Calculate the parameter $\alpha_t = \log(1/\beta_t)$.

Step 6. $t = t + 1$. If $t \leq T$, go to Step 1.

The final strong classifier is:

$$h(x) = \begin{cases} 1 & \text{if } c_r = \sum_{t=1}^{T} \alpha_t h_t(x) \geq 0 \\ 0 & \text{otherwise} \end{cases}$$

where $c_r$ is the confidence rate associated with the label.
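
Read as pseudocode, Table 2 translates almost directly into a training loop. The sketch below is a minimal Python rendering, assuming single-threshold decision stumps as the per-dimension weak learners $h_j$ (the table only requires that each $h_j$ use a single variable); the names `train_adaboost`, `strong_classifier`, and `stump_predict` are illustrative, not from the original reference.

```python
# Minimal sketch of the AdaBoost variant in Table 2, assuming single-threshold
# decision stumps as the per-dimension weak learners h_j; all names below are
# illustrative, not from the original text.
import numpy as np


def stump_predict(X, j, thr, sign):
    """Decision stump on dimension j: predict +1 or -1 from a single threshold."""
    return np.where(sign * (X[:, j] - thr) >= 0, 1, -1)


def train_adaboost(X, c, T=10):
    """X: (N, D) data matrix; c: (N,) labels in {-1, +1}; T: number of rounds."""
    N, D = X.shape
    m = np.sum(c == -1)                     # negative data points
    l = np.sum(c == +1)                     # positive data points

    # [Initialization] w_{1,i} = 1/(2m) or 1/(2l) depending on the class of x_i
    w = np.where(c == -1, 1.0 / (2 * m), 1.0 / (2 * l)).astype(float)

    ensemble = []                           # list of (alpha_t, j, thr, sign)
    for t in range(T):
        # Step 1. Normalize the weights
        w = w / w.sum()

        # Step 2. For each dimension j, search the best stump restricted to that
        # variable; here eps_j = sum_i w_i [h_j(x_i) != c_i], i.e. Table 2's
        # error |h_j(x_i) - c_i| rescaled by 1/2 so that it lies in [0, 1]
        best = None
        for j in range(D):
            for thr in np.unique(X[:, j]):
                for sign in (+1, -1):
                    eps = np.sum(w * (stump_predict(X, j, thr, sign) != c))
                    if best is None or eps < best[0]:
                        best = (eps, j, thr, sign)

        # Step 3. Keep the classifier h_t with the lowest error eps_t
        eps_t, j, thr, sign = best
        if eps_t >= 0.5:                    # extra stop: weak learner no better
            break                           # than chance (error over 0.5)

        # Step 4. w_{t+1,i} = w_{t,i} * beta_t^{e_i}, with e_i = 1 for
        # well-classified data points and e_i = 0 otherwise
        beta_t = max(eps_t, 1e-12) / (1.0 - eps_t)
        e = (stump_predict(X, j, thr, sign) == c).astype(float)
        w = w * beta_t ** e

        # Step 5. alpha_t = log(1 / beta_t) weights h_t in the final ensemble
        ensemble.append((np.log(1.0 / beta_t), j, thr, sign))
        # Step 6. t = t + 1 is handled by the for loop; repeat while t <= T

    return ensemble


def strong_classifier(ensemble, x):
    """Return (label, c_r): label in {0, 1} and the confidence rate c_r."""
    c_r = sum(alpha * (1 if sign * (x[j] - thr) >= 0 else -1)
              for alpha, j, thr, sign in ensemble)
    return (1 if c_r >= 0 else 0), c_r
```

Because $\beta_t < 1$ whenever $\epsilon_t < 0.5$, Step 4 shrinks the weights of the well-classified data points, which is what pushes each subsequent weak learner toward the examples that remain hardest.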
At each round, the weight of the data points correctly classified by the previous weak learners is reduced, or increased otherwise. As a result, each classifier is centered on the most difficult data up to that point.
Let the training set be composed of $N$ pairs $\{x_i, c_i\}$, where $c_i \in \{-1, 1\}$ is the class of each multidimensional data point $x_i$. Let $w_{t,i}$ be the weighting factor at time $t$ for data point $x_i$. Also, let $l$ and $m$ be the number of data points in each class. The AdaBoost algorithm is described in Table 2.

Parameter $\alpha_t$ is the weighting factor of each of the classifiers in the ensemble. The loop also ends when the classification error of the weak classifier is over 0.5.
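
As a rough illustration of the confidence rate $c_r$, the sketch above could be exercised on a toy two-dimensional problem; the data, seed, and test point below are synthetic and purely hypothetical.

```python
# Toy example: the positive class lies above the line x0 + x1 = 1 (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
c = np.where(X[:, 0] + X[:, 1] > 1.0, 1, -1)

ensemble = train_adaboost(X, c, T=20)          # weak learners chosen as in Table 2
label, c_r = strong_classifier(ensemble, np.array([0.9, 0.8]))
print(label, c_r)                              # a point deep in the positive region
```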
 