Fig. 2.5 Sum-of-margin strategy
\[
\begin{aligned}
\min \quad & \frac{1}{2}\|w\|^{2} \;+\; \lambda \sum_{i=1}^{n}\sum_{j=1}^{m^{(i)}}\sum_{k=1}^{K-1}\bigl(\xi_{j,k}^{(i)} + \xi_{j,k}^{*(i)}\bigr) \\
\text{s.t.} \quad & w^{T}x_{j}^{(i)} - b_{k} \le -1 + \xi_{j,k}^{(i)}, \quad \text{if } y_{j}^{(i)} = k, \\
& w^{T}x_{j}^{(i)} - b_{k} \ge 1 - \xi_{j,k}^{*(i)}, \quad \text{if } y_{j}^{(i)} = k+1, \\
& \xi_{j,k}^{(i)} \ge 0, \quad \xi_{j,k}^{*(i)} \ge 0, \\
& j = 1,\dots,m^{(i)},\; i = 1,\dots,n,\; k = 1,\dots,K-1.
\end{aligned}
\tag{2.12}
\]
The second strategy is called the sum-of-margins strategy. In this strategy, some additional thresholds $a_k$ are introduced, such that for category $k$, $b_{k-1}$ is its lower-bound threshold and $a_k$ is its upper-bound threshold. Accordingly, the constraints become that for documents in category $k$, $w^{T}x_{j}^{(i)}$ should exceed threshold $b_{k-1}$ but be smaller than threshold $a_k$, with certain soft margins (i.e., the slacks $\xi_{j,k-1}^{*(i)}$ and $\xi_{j,k}^{(i)}$, respectively). The corresponding learning process can be expressed as follows, from which we can see that the margin term $\sum_{k=1}^{K-1}(a_k - b_k)$ really has the meaning of "margin" (in Fig. 2.5, $b_k - a_k$ is exactly the margin between category $k+1$ and category $k$).
\[
\begin{aligned}
\min \quad & \sum_{k=1}^{K-1}(a_{k} - b_{k}) \;+\; \lambda \sum_{i=1}^{n}\sum_{j=1}^{m^{(i)}}\sum_{k=1}^{K-1}\bigl(\xi_{j,k}^{(i)} + \xi_{j,k}^{*(i)}\bigr) \\
\text{s.t.} \quad & a_{k} \le b_{k} \le a_{k+1}, \\
& w^{T}x_{j}^{(i)} \le a_{k} + \xi_{j,k}^{(i)}, \quad \text{if } y_{j}^{(i)} = k, \\
& w^{T}x_{j}^{(i)} \ge b_{k} - \xi_{j,k}^{*(i)}, \quad \text{if } y_{j}^{(i)} = k+1, \\
& \xi_{j,k}^{(i)} \ge 0, \quad \xi_{j,k}^{*(i)} \ge 0, \quad \|w\|^{2} \le 1, \\
& j = 1,\dots,m^{(i)},\; i = 1,\dots,n,\; k = 1,\dots,K-1.
\end{aligned}
\tag{2.13}
\]
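To make the formulation concrete, below is a minimal sketch (not part of the original text) of how the sum-of-margins problem (2.13) could be prototyped for a single query using the cvxpy modeling library; the function name, the toy data, and the single-query simplification are illustrative assumptions, and (2.12) can be prototyped analogously by changing the objective and constraints.

import cvxpy as cp
import numpy as np

def sum_of_margins_svm(X, y, K, lam=1.0):
    """Prototype of (2.13) for one query: X is an (m, d) feature matrix, y[j] in {1, ..., K}."""
    m, d = X.shape
    w = cp.Variable(d)
    a = cp.Variable(K - 1)                           # upper-bound thresholds a_k
    b = cp.Variable(K - 1)                           # lower-bound thresholds b_k
    xi = cp.Variable((m, K - 1), nonneg=True)        # slacks for the "<= a_k" constraints
    xi_star = cp.Variable((m, K - 1), nonneg=True)   # slacks for the ">= b_k" constraints

    cons = [cp.sum_squares(w) <= 1]                  # ||w||^2 <= 1
    cons += [a[k] <= b[k] for k in range(K - 1)]     # a_k <= b_k
    cons += [b[k] <= a[k + 1] for k in range(K - 2)] # b_k <= a_{k+1}
    for k in range(K - 1):          # 0-based k here plays the role of threshold index k+1 in (2.13)
        for j in range(m):
            if y[j] == k + 1:       # document in category k: score should stay below a_k
                cons.append(X[j] @ w <= a[k] + xi[j, k])
            elif y[j] == k + 2:     # document in category k+1: score should stay above b_k
                cons.append(X[j] @ w >= b[k] - xi_star[j, k])

    margin_term = cp.sum(a - b)     # minimizing sum(a_k - b_k) maximizes the sum of margins
    slack_term = lam * (cp.sum(xi) + cp.sum(xi_star))
    cp.Problem(cp.Minimize(margin_term + slack_term), cons).solve()
    return w.value, a.value, b.value

if __name__ == "__main__":
    # Toy usage with random features and labels in {1, 2, 3} (K = 3).
    X = np.random.randn(20, 5)
    y = np.random.randint(1, 4, size=20)
    w, a, b = sum_of_margins_svm(X, y, K=3)
    print("thresholds a:", a, "thresholds b:", b)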
Ideally, in the above methods, one requires the thresholds $b_k$ ($k = 1,\dots,K-1$) to be in an increasing order (i.e., $b_{k-1} \le b_k$). However, in practice, since there are no clear constraints on the thresholds in the optimization problem, the learning process cannot always guarantee this. To tackle the problem, Chu and Keerthi [3] propose adding explicit or implicit constraints on the thresholds. The explicit constraint simply requires $b_1 \le b_2 \le \cdots \le b_{K-1}$ in the optimization problem.
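As a purely illustrative sketch of this ordering issue (the helper functions and the sample threshold values below are hypothetical, and this is not Chu and Keerthi's construction, which builds the constraints into the optimization itself), one can at least detect a violation of the ordering and repair it post hoc with a simple monotone projection:

import numpy as np

def is_ordered(b):
    """Check that b_1 <= b_2 <= ... <= b_{K-1}."""
    return bool(np.all(np.diff(b) >= 0))

def project_nondecreasing(b):
    """L2-project thresholds onto the non-decreasing cone (pool-adjacent-violators pass)."""
    merged = []                     # list of blocks; each block is averaged in the output
    for v in np.asarray(b, dtype=float):
        merged.append([v])
        # merge blocks while their running means violate the ordering
        while len(merged) > 1 and np.mean(merged[-2]) > np.mean(merged[-1]):
            last = merged.pop()
            merged[-1] = merged[-1] + last
    return np.array([np.mean(blk) for blk in merged for _ in blk])

b_learned = np.array([0.2, 1.5, 1.1, 2.0])    # hypothetical output: b_3 < b_2 violates the ordering
print(is_ordered(b_learned))                  # False
print(project_nondecreasing(b_learned))       # [0.2, 1.3, 1.3, 2.0]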
 