[Figure 4.13 Non-Linear Dividing Line: two scatter panels, (a) and (b), showing class A and class B points; panel axes are labeled X, Y and X 2, Y 2]
the kernel functions. Typical kernel functions implemented are linear, polynomial, radial basis, and sigmoid. By default, most SVM implementations use the radial basis function. Finding the kernel that performs best is, for the most part, a trial-and-error process.
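As an illustrative sketch (not from the text), the four kernel functions named above can be written directly in NumPy. The parameter values (gamma, degree, coef0, alpha) are arbitrary defaults chosen for the example, not prescribed settings.

```python
import numpy as np

def linear_kernel(x, z):
    # plain dot product: no transformation of the input space
    return x @ z

def polynomial_kernel(x, z, degree=3, coef0=1.0):
    # implicitly maps inputs into a space of polynomial feature combinations
    return (x @ z + coef0) ** degree

def rbf_kernel(x, z, gamma=0.5):
    # radial basis function: similarity decays with squared distance,
    # equals 1 when x and z are identical
    return np.exp(-gamma * np.sum((x - z) ** 2))

def sigmoid_kernel(x, z, alpha=0.5, coef0=0.0):
    # tanh of a scaled dot product, bounded in (-1, 1)
    return np.tanh(alpha * (x @ z) + coef0)

x = np.array([1.0, 2.0])
z = np.array([0.5, -1.0])
```

Each kernel measures similarity between two points in a different implicit feature space, which is what lets an SVM draw a non-linear dividing line in the original space.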
Moving beyond two-dimensional predictors
In the real world, most classification models will be built using more than
two predictors. To facilitate visual understanding, the dataset plots of Figures 4.10-4.13 contained just two predictors. However, moving beyond two
input dimensions using SVMs is not a problem. In three dimensions, instead of a
dividing line, a dividing plane is located. In four or more dimensions, it becomes
a hyperplane. Even though it is not possible to visually represent a hyperplane,
the mechanics of the algorithm are still the same.
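A minimal sketch of this point, using hypothetical weights w and bias b: the decision rule sign(w·x + b) that defines the dividing line in two dimensions defines a hyperplane in exactly the same way in four (or any number of) dimensions.

```python
import numpy as np

# Hypothetical hyperplane in a four-predictor space:
# w holds one weight per predictor, b is the offset.
w = np.array([1.0, -2.0, 0.5, 3.0])
b = -1.0

def classify(x):
    # same mechanics as the two-dimensional case: which side of
    # the hyperplane w.x + b = 0 does the point fall on?
    return "class A" if w @ x + b >= 0 else "class B"
```

Nothing in `classify` depends on the number of predictors; only the length of `w` changes.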
SVM Advantages and Limitations
Like the ANN, SVM classifiers can be made to fit almost any dataset, depending
on the chosen kernel function and the inputs selected. This is a two-edged
sword. A good fit is important, yet like the ANN, SVM classifiers frequently
overfit the data.
The best way to avoid this problem is to build multiple SVM models using varying combinations of kernel functions and cost parameter settings, then compare the results against a validation dataset and choose the combination that fits the validation set best.
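The procedure above can be sketched with scikit-learn's `SVC` (an assumption: the text names no library) on a small synthetic dataset. The kernel list and cost values in the grid are illustrative choices, not recommendations.

```python
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# synthetic four-predictor dataset, split into training and validation sets
X, y = make_classification(n_samples=300, n_features=4, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0)

best_score, best_config = -1.0, None
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    for C in (0.1, 1.0, 10.0):          # cost parameter settings to try
        model = SVC(kernel=kernel, C=C).fit(X_train, y_train)
        score = model.score(X_val, y_val)  # accuracy on the validation set
        if score > best_score:
            best_score, best_config = score, (kernel, C)
```

Scoring on the held-out validation set, rather than the training set, is what guards against picking a badly overfit kernel/cost combination.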
 