[Fig. 10.6: scatter plot; horizontal axis: Audio Score, 0 to 1.0; vertical axis: 0 to 1.]
Fig. 10.6 A two-dimensional plot of data samples obtained from the database to be classified by SVM for a given query. According to the ground truth, the positive samples are marked as 'plus' and negative samples are marked as 'circle'.
The way to solve this problem is through the Lagrangian dual. In practice, however, a separating hyperplane may not exist, e.g., if a high noise level causes a large overlap of the classes. Thus, we employ a soft margin classifier, the C-support vector classifier (C-SVC) [305], for implementation in the current work. The software library for this implementation may be found in [332]. The C-SVC uses the constant C > 0 as the upper bound on the dual variables, which is the only difference from the separable case [cf. Eq. (10.34)]. The technique here is to minimize the objective function
$$\tau(\mathbf{w}, \boldsymbol{\xi}) = \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{m} \xi_i \qquad (10.36)$$
subject to
$$y_i \cdot \bigl((\mathbf{w} \cdot \mathbf{x}_i) + b\bigr) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, m \qquad (10.37)$$
where the $\xi_i$ are slack variables. Incorporating kernels and rewriting the problem in terms of Lagrange multipliers leads to the dual problem of maximizing:
$$\max_{\boldsymbol{\alpha} \in \mathbb{R}^m} \ \sum_{i=1}^{m} \alpha_i - \frac{1}{2} \sum_{i,j=1}^{m} \alpha_i \alpha_j \, y_i y_j \, k(\mathbf{x}_i, \mathbf{x}_j) \qquad (10.38)$$

subject to

$$0 \le \alpha_i \le C, \ i = 1, \dots, m, \quad \text{and} \quad \sum_{i=1}^{m} \alpha_i y_i = 0. \qquad (10.39)$$
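As a concrete illustration of Eqs. (10.38)–(10.39), the following is a minimal sketch that solves the C-SVC dual directly with a general-purpose optimizer. The toy data, the RBF kernel choice, and the values of C and gamma are illustrative assumptions, not taken from the text; a practical system would instead use a dedicated SVM library such as the one cited in [332].

```python
# A minimal sketch of the C-SVC dual in Eqs. (10.38)-(10.39).
# Assumptions (not from the text): toy 2-D data, an RBF kernel, C = 1, gamma = 2.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X1, X2, gamma=1.0):
    """k(x, x') = exp(-gamma * ||x - x'||^2)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

# Toy 2-D samples (e.g., scores in [0, 1]) with labels y_i in {+1, -1}.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 2))
y = np.where(X.sum(axis=1) > 1.0, 1.0, -1.0)

C, gamma = 1.0, 2.0
m = len(y)
K = rbf_kernel(X, X, gamma)
Q = (y[:, None] * y[None, :]) * K          # Q_ij = y_i y_j k(x_i, x_j)

# Maximize sum_i alpha_i - 1/2 alpha^T Q alpha  <=>  minimize its negative,
# subject to 0 <= alpha_i <= C and sum_i alpha_i y_i = 0, per Eq. (10.39).
res = minimize(
    fun=lambda a: 0.5 * a @ Q @ a - a.sum(),
    x0=np.zeros(m),
    jac=lambda a: Q @ a - np.ones(m),
    bounds=[(0.0, C)] * m,
    constraints=[{"type": "eq", "fun": lambda a: a @ y}],
    method="SLSQP",
)
alpha = res.x

# Recover the bias b from a margin support vector (0 < alpha_i < C), where
# y_i * (sum_j alpha_j y_j k(x_j, x_i) + b) = 1 holds with equality.
sv = (alpha > 1e-6) & (alpha < C - 1e-6)
b = np.mean(y[sv] - (alpha * y) @ K[:, sv])

def decision(X_new):
    """f(x) = sum_i alpha_i y_i k(x_i, x) + b; classify by sign(f)."""
    return (alpha * y) @ rbf_kernel(X, X_new, gamma) + b

print("training accuracy:", np.mean(np.sign(decision(X)) == y))
```

Note how the box constraint 0 <= alpha_i <= C enters only through the optimizer's bounds: setting C to infinity would recover the separable (hard margin) case, which is exactly the "only difference" remarked on above.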