affect the output sign of the classifier but only its magnitude. For this purpose, we propose the use of a new vector β, which is defined as:

\beta_i = \frac{2^k - 1}{C}\,\alpha_i, \quad (5.1)

where k is the number of bits. Moreover, by omitting the equality constraint of the dual SVM formulation (y^T α = 0), it is possible to obtain a formulation without a bias term b. This is clearly advantageous when we deal with a fixed-point arithmetic formulation, as this value can be difficult to control; instead, the FFP can be easily computed as:
f(x) = \sum_{i=1}^{n} y_i \alpha_i K(x_i, x) \quad (5.2)
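As a concrete sketch of the bias-free FFP of Eq. (5.2) — with a Gaussian RBF kernel and purely illustrative, hypothetical values for the support vectors, labels, and coefficients — the decision function can be computed as:

```python
import math

def rbf_kernel(xi, x, gamma=0.5):
    # Gaussian RBF kernel: exp(-gamma * ||xi - x||^2)
    sq_dist = sum((a - b) ** 2 for a, b in zip(xi, x))
    return math.exp(-gamma * sq_dist)

def ffp(x, support_vectors, y, alpha):
    # Bias-free feed-forward phase: f(x) = sum_i y_i * alpha_i * K(x_i, x)
    # No bias term b is needed, as in Eq. (5.2).
    return sum(yi * ai * rbf_kernel(xi, x)
               for xi, yi, ai in zip(support_vectors, y, alpha))

# Hypothetical trained model (illustrative values only)
support_vectors = [[0.1, 0.2], [0.8, 0.9]]
y = [+1, -1]
alpha = [0.7, 0.7]

print(ffp([0.1, 0.25], support_vectors, y, alpha))  # positive -> class +1
```

Only the sign of f(x) is used for classification, which is what makes the rescaling of Eq. (5.1) harmless.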
This modification has no influence on the classification performance of the trained model as long as a Radial Basis Function (RBF) kernel, such as the Gaussian or the Laplacian one, is exploited (Poggio et al. 2002). These two modifications yield the following formulation:
\min_{\beta} \; \frac{1}{2}\,\beta^T Q \beta - s^T \beta \quad \text{s.t.} \quad 0 \le \beta_i \le 2^k - 1, \;\; \forall i \in \{1, \dots, n\}, \quad (5.3)

where s_i = (2^k - 1)/C, ∀i ∈ {1, ..., n}. Once the problem expressed in Eq. (5.3) is solved, β can straightforwardly target fixed-point arithmetic through a simple nearest-integer normalization (Anguita et al. 2007).
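As an illustrative sketch of the rescaling of Eq. (5.1) followed by the nearest-integer normalization (the bit width k, the value of C, and the α values below are hypothetical):

```python
def quantize_alpha(alpha, C, k):
    # beta_i = ((2^k - 1) / C) * alpha_i, rounded to the nearest integer,
    # so that every coefficient fits in k bits: 0 <= beta_i <= 2^k - 1
    # (the dual constraint 0 <= alpha_i <= C guarantees the range).
    scale = (2 ** k - 1) / C
    return [round(a * scale) for a in alpha]

# Hypothetical values: C = 10, k = 8 bits, alpha_i in [0, C]
alpha = [0.0, 2.5, 10.0]
beta = quantize_alpha(alpha, C=10.0, k=8)
print(beta)  # each beta_i is an integer in [0, 255]
```

Since only the magnitude of f(x) is scaled, rounding β to integers perturbs the decision boundary far less than rounding the original α would.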
To finally obtain a full FFP with only integer values, it is necessary to modify the representation of the kernel K(·,·) and of the input vector x in terms of number of bits (u and v bits, respectively) (Anguita et al. 2007). This produces:
0 \le K(x_i, x) \le 1 - 2^{-u}, \quad \forall i \in \{1, \dots, n\}, \quad (5.4)

0 \le x_j \le 1 - 2^{-v}, \quad \forall j \in \{1, \dots, d\}. \quad (5.5)
Consequently, the modified fixed-point FFP formulation is:

f(x) = \sum_{i=1}^{n} y_i \beta_i K(x_i, x) \quad (5.6)
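A minimal integer-only sketch of Eq. (5.6), assuming hypothetical bit widths and a kernel already quantized to u bits (i.e., K(x_i, x) stored as an integer in [0, 2^u − 1], scaled by 2^u):

```python
def ffp_fixed_point(kernel_vals, y, beta):
    # Integer-only FFP: f(x) = sum_i y_i * beta_i * K_q(x_i, x),
    # where kernel_vals[i] is K(x_i, x) pre-quantized to u bits.
    # Every operation here is an integer multiply or add.
    return sum(yi * bi * ki for yi, bi, ki in zip(y, beta, kernel_vals))

# Hypothetical quantized model: u = 8 bits for the kernel, k = 8 bits for beta
kernel_vals = [250, 40]   # quantized K(x_i, x), integers in [0, 255]
y = [+1, -1]
beta = [64, 255]

score = ffp_fixed_point(kernel_vals, y, beta)
print(score)                     # an integer; its magnitude is scaled by 2^u
print(+1 if score >= 0 else -1)  # the sign gives the predicted class
```

The common scale factor (2^k − 1)/C · 2^u multiplies every term identically, so it never changes the sign of f(x) and can simply be ignored.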
In particular, we opted for a Laplacian kernel, K(x_i, x_j) = 2^{-\gamma \|x_i - x_j\|_1}, instead of the more conventional Gaussian kernel, as it is more convenient for hardware-limited devices (Anguita et al. 2007) because it can be easily computed using shifters. The Manhattan norm is defined as \|x\|_1 = \sum_{j=1}^{d} |x_j|, and γ > 0 is the kernel hyperparameter, which can be selected (together with the regularization hyperparameter C), for example, through a k-Fold Cross Validation (KCV) procedure (with
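To illustrate why the base-2 Laplacian kernel is shifter-friendly, here is a sketch under the simplifying assumptions that the inputs are already integers in fixed point and that γ is restricted to a power of two (both assumptions are ours, chosen so the exponent m = γ·‖x_i − x_j‖_1 is an integer and 2^{−m} reduces to a single right shift):

```python
def laplacian_kernel_fp(xi, xj, gamma_shift, u):
    # Manhattan (L1) distance between integer fixed-point inputs
    dist = sum(abs(a - b) for a, b in zip(xi, xj))
    # With gamma = 2^gamma_shift, the exponent m = gamma * dist
    # is computed with a left shift instead of a multiplication.
    m = dist << gamma_shift
    # K = 2^(-m), stored with u fractional bits: a single right shift
    # of the fixed-point "1" replaces the exponential of a Gaussian.
    # Saturate at 2^u - 1 so that 0 <= K <= 1 - 2^(-u), as in Eq. (5.4).
    return min((1 << u) >> m, (1 << u) - 1)

# Hypothetical integer inputs and parameters
print(laplacian_kernel_fp([3, 5], [2, 5], gamma_shift=1, u=8))  # 256 >> 2 = 64
print(laplacian_kernel_fp([3, 5], [3, 5], gamma_shift=1, u=8))  # saturates at 255
```

No multiplier, exponential unit, or lookup table is needed: the entire kernel evaluation reduces to subtractions, absolute values, and shifts, which is exactly what makes it attractive on hardware-limited devices.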