5.5.4 Kernel Modification Methods
Several techniques have been proposed in the literature to make the SVM algo-
rithm less sensitive to the class imbalance by modifying the associated kernel
function.
5.5.4.1 Class Boundary Alignment Wu and Chang [9] have proposed a variant
of SVM learning method, where the kernel function is conformally transformed
to enlarge the margin around the class boundary region in the transformed higher
dimensional feature space, thereby improving performance. Wu and Chang
[29] have adapted this method to imbalanced datasets by enlarging the margin
around the minority-class boundary more than the margin around the
majority-class boundary. This method, called the class boundary alignment (CBA)
method, can only be used with a vector space representation of the input
data. Wu and Chang [30] have further proposed a variant of the CBA method
for the sequence representation of imbalanced input data by modifying the kernel
matrix to have a similar effect, which is called the kernel boundary alignment
(KBA) method.
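The core idea of a conformal kernel transformation can be illustrated with a short sketch. This is not the exact CBA/KBA formulation of Wu and Chang; it is a minimal illustration in which the magnification factor D(x) (here heuristically based on proximity to support vectors) is an assumption, and the transformed kernel takes the standard conformal form K̃(x, y) = D(x) K(x, y) D(y):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Standard RBF kernel matrix between row-vector sets X (n, d) and Y (m, d).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def conformal_factor(X, support_vectors, tau=1.0):
    # Hypothetical magnification factor D(x): large near the estimated class
    # boundary, approximated here by proximity to the support vectors.
    d2 = ((X[:, None, :] - support_vectors[None, :, :]) ** 2).sum(-1)
    return np.exp(-tau * d2).sum(axis=1)

def conformal_kernel(X, Y, support_vectors, gamma=1.0, tau=1.0):
    # Conformally transformed kernel: K~(x, y) = D(x) K(x, y) D(y).
    # Making D larger near the boundary stretches the induced metric there,
    # which enlarges the margin region in feature space.
    dx = conformal_factor(X, support_vectors, tau)
    dy = conformal_factor(Y, support_vectors, tau)
    return dx[:, None] * rbf_kernel(X, Y, gamma) * dy[None, :]
```

Because the transformation only rescales a positive semidefinite kernel by positive factors, the resulting matrix remains a valid (symmetric, positive semidefinite) kernel. CBA would additionally make D larger around the minority class than around the majority class.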
5.5.4.2 Kernel Target Alignment In the context of SVM learning, a quantitative
measure of the agreement between the kernel function and the learning task
is important from both theoretical and practical points of view. The kernel
target alignment method was proposed in [31] to measure the agreement between
a given kernel and the classification task, and has been adapted to learning
from imbalanced datasets in [32].
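Kernel target alignment is usually defined as the Frobenius cosine similarity between the kernel matrix K and the ideal target kernel yyᵀ built from the ±1 labels. A minimal sketch of this measure (function name is illustrative):

```python
import numpy as np

def kernel_alignment(K, y):
    # Kernel-target alignment: the cosine similarity, under the Frobenius
    # inner product, between the kernel matrix K and the ideal target
    # kernel y y^T, where y holds +1/-1 labels.  Values close to 1 mean
    # the kernel is well matched to the classification task.
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))
```

For a "perfect" kernel K = yyᵀ the alignment is exactly 1, while an uninformative kernel such as the identity matrix yields a much lower value.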
5.5.4.3 Margin Calibration The DEC method described previously modifies
the SVM objective function by assigning a higher misclassification cost to the
positive examples than to the negative examples. Yang
et al. [33] have extended this method to modify the SVM objective function not
only in the penalty term but also in the margin term, in order to correct
the biased decision boundary. In this method, the modification first
adopts an inversely proportional regularized penalty to reweight the imbalanced
classes. It then employs margin compensation to make the margin lopsided,
which allows the biased decision boundary to drift.
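The penalty-reweighting part (as in DEC, with class penalties inversely proportional to class sizes) can be sketched with a standard SVM implementation. This is not the full margin-calibration method of Yang et al., only the cost-sensitive penalty component; the toy data and class ratio below are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVC

# Imbalanced toy data: 200 majority (negative) and 20 minority (positive)
# examples drawn from two overlapping Gaussians (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(2.0, 1.0, size=(20, 2))])
y = np.array([-1] * 200 + [1] * 20)

# Inversely proportional penalties: C+ / C- = n- / n+ = 10, so each class
# contributes a comparable share of the total misclassification cost.
n_neg, n_pos = 200, 20
clf = SVC(kernel="rbf", C=1.0,
          class_weight={-1: 1.0, 1: n_neg / n_pos})
clf.fit(X, y)
```

In scikit-learn the same reweighting can be requested with `class_weight="balanced"`, which sets each class weight to `n_samples / (n_classes * n_class_samples)`.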
5.5.4.4 Other Kernel-Modification Methods Several imbalanced
learning techniques have been proposed in the literature for other kernel-based
classifiers. These methods include the kernel classifier construction algorithm proposed
in [34], based on orthogonal forward selection (OFS) and the regularized orthogonal
weighted least squares (ROWLS) estimator, the kernel neural gas (KNG) algorithm
for imbalanced clustering [35], the P2PKNNC algorithm based on the k-nearest
neighbors classifier and the P2P communication paradigm [36], Adaboost rele-
vance vector machine (RVM) [37], among others.