Algorithm 7.4 Selective accommodation of previous minority class examples
Inputs:
1: current timestamp t.
2: current training data chunk: S(t) = {(x_1, y_1), ..., (x_m, y_m)}.
3: current data set under evaluation: T(t) = {x_1, ..., x_n}.
4: minority class data queue: Q.
5: soft-typed base classifier: L.
6: selective accommodation method: d. /* d could be SERA, MuSeRA, or REA */
7: post-balanced ratio: f. /* desired minority class to majority class ratio */
8: hypotheses set: H = {h_1, h_2, ..., h_{t-1}}.
Procedure:
9:  for t: 1, ..., ∞ do
10:   S(t) ← {P(t), N(t)} /* assume |P(t)| = p and |N(t)| = q */
11:   if (|P(t)| + |Q|) / |S(t)| <= f then
12:     h_t ← L({S(t), Q})
13:   else
14:     n ← f × |S(t)| − |P(t)|
15:     m ← {}
16:     for x_j ∈ Q do
17:       if d = REA then
18:         K ← k-nearest-neighbor(x_j, S(t)) /* calculates the k nearest neighbors of x_j within S(t) */
19:         m ← {m, |K ∩ P(t)|} /* the number of minority cases within the k nearest neighbors of x_j */
20:       else
21:         ω = (x_j − μ)^T Σ^{-1} (x_j − μ) /* μ and Σ are the mean and the covariance matrix of P(t) */
22:         m ← {m, 1/ω}
23:     (m, I) ← reverse-sort(m) /* sort m in descending order, and put the corresponding indices in I */
24:     I ← I(1:n) /* use the most similar previous minority class examples to augment S(t) to achieve the desired class ratio f */
25:     h_t ← L({S(t), Q(I)})
26:   Q ← {Q, P(t)}
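The similarity-ranking loop of steps 16-23 can be sketched in Python as follows. This is a minimal sketch, not the authors' implementation: the function and parameter names are illustrative, the Euclidean metric for the k-NN search is an assumption, and the inverse Mahalanobis distance 1/ω is used as the similarity score for the non-REA branch.

```python
import numpy as np

def rank_queue(Q_X, P_X, S_X, S_y, method="REA", k=5, minority_label=1):
    """Score each queued minority example by its similarity to the current
    minority class (steps 16-23 of Algorithm 7.4, sketched).

    REA branch: score = number of minority-class examples among the k
    nearest neighbors of x_j within the current chunk S(t).
    Other branch (SERA/MuSeRA): score = 1/omega, the inverse Mahalanobis
    distance of x_j to the minority set P(t).
    Returns the queue indices I sorted by descending similarity."""
    if method == "REA":
        scores = []
        for x in Q_X:
            # k nearest neighbors of x within S(t), by Euclidean distance
            nn = np.argsort(np.linalg.norm(S_X - x, axis=1))[:k]
            scores.append(np.sum(S_y[nn] == minority_label))
        scores = np.array(scores, dtype=float)
    else:
        mu = P_X.mean(axis=0)                      # mean of P(t)
        cov_inv = np.linalg.pinv(np.cov(P_X, rowvar=False))
        diffs = Q_X - mu
        # omega_j = (x_j - mu)^T Sigma^{-1} (x_j - mu) for every x_j at once
        omega = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)
        scores = 1.0 / omega
    return np.argsort(scores)[::-1]                # indices I, most similar first
```

Step 24 then keeps the first n entries of the returned index array, e.g. `I = rank_queue(...)[:n]`.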
27:   if d = SERA then
28:     return h_t^final = h_t for predicting the class label of any instance x within T(t)
29:   else
30:     H ← {H, h_t}
31:     W ← {}
32:     for i: 1, ..., t do
33:       e_i = (1/|S(t)|) × Σ_{(x_j, y_j) ∈ S(t)} (1 − f^i_{y_j}(x_j))^2
34:       W ← {W, log(1/e_i)}
35:     return the composite hypothesis h^(t)_final for predicting the class label of any instance x_j in the testing data set T(t), given by H × W, i.e.,

        h^(t)_final(x_j) = argmax_{c ∈ Y} Σ_{i=1}^{t} w_i × f^i_c(x_j)        (7.19)
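The error-based weighting of steps 32-34 and the composite prediction of Eq. (7.19) can be sketched as follows. This is a sketch under assumptions: each hypothesis is taken to be a callable returning a matrix of soft class probabilities, and the function names are illustrative.

```python
import numpy as np

def ensemble_weights(hypotheses, S_X, S_y):
    """Steps 32-34, sketched: e_i is the mean squared error of hypothesis
    i's soft output for the true class on the current chunk S(t), and the
    weight is w_i = log(1/e_i)."""
    W = []
    for h in hypotheses:
        probs = h(S_X)                              # (n_samples, n_classes) soft outputs
        f_true = probs[np.arange(len(S_y)), S_y]    # f^i_{y_j}(x_j) for each example
        e = np.mean((1.0 - f_true) ** 2)
        W.append(np.log(1.0 / e))
    return np.array(W)

def predict_composite(hypotheses, W, x):
    """Eq. (7.19), sketched: argmax over classes of the weighted sum of
    the hypotheses' soft outputs for instance x."""
    votes = sum(w * h(x[None, :])[0] for h, w in zip(hypotheses, W))
    return int(np.argmax(votes))
```

Note that a hypothesis with a lower weighted squared error on the current chunk receives a larger (logarithmic) weight, so recent, well-performing hypotheses dominate the vote.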