MultiBoost with LDA Classifier
MultiBoost with an LDA classifier incorporates the linear discriminant analysis (LDA) algorithm to form linear combinations of selected features, generating new combined features. The combined features are used along with the original features in the boosting algorithm to improve classification performance. Given a binary classification problem with linear classifiers specified by discriminant functions, LDA assumes the covariance matrices of the two classes to be equal, Σ_1 = Σ_2 = Σ. Denoting the class means by μ_1 and μ_2, and an arbitrary feature vector by x, define
D(x) = [b; w]^T [1; x],

where

w = Σ^{-1} (μ_2 − μ_1),
b = −w^T μ,
μ = (1/2)(μ_1 + μ_2).
D(x) is the signed distance of the feature vector x to the separating hyperplane described by its normal vector w and the bias b. If D(x) is greater than 0, the observation x is classified as class 2, and otherwise as class 1.
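The LDA decision rule above can be sketched in Python with NumPy. The two toy classes below are illustrative (not from the text); the pooled-covariance estimate is one common way to realize the equal-covariance assumption:

```python
import numpy as np

def fit_lda(X1, X2):
    """Fit a two-class LDA discriminant under a shared covariance matrix."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # Pooled covariance estimate: both classes assumed to share Sigma.
    sigma = ((n1 - 1) * np.cov(X1, rowvar=False)
             + (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    w = np.linalg.solve(sigma, mu2 - mu1)   # w = Sigma^{-1} (mu2 - mu1)
    mu = 0.5 * (mu1 + mu2)                  # midpoint of the class means
    b = -w @ mu                             # b = -w^T mu
    return w, b

def predict(x, w, b):
    """D(x) = w^T x + b; D(x) > 0 -> class 2, otherwise class 1."""
    return 2 if w @ x + b > 0 else 1
```

In MultiBoost, the projection w^T x produced this way would serve as a new combined feature alongside the original ones.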
MultiBoost with NB Classifier
The Naive Bayes classifier estimates the posterior probability that an instance belongs to
a class, given the observed attribute values for the instance. It builds a simple conditional
independence classifier. Formally, the probability of a class label value y for an unlabeled
instance x containing n attributes A_1, A_2, ..., A_n is given by

P(y | x) = P(x | y) P(y) / P(x)
         = P(A_1, A_2, ..., A_n | y) P(y) / P(A_1, A_2, ..., A_n)
         = P(y) ∏_{j=1}^{n} P(A_j | y) / P(A_1, A_2, ..., A_n).
The preceding probability is computed for each class and the prediction is made for the class
with the largest posterior probability. The probabilities in the aforementioned formulas must
be estimated from the training set.
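Estimating these probabilities from the training set can be done with simple frequency counts, as in this minimal sketch (the weather-style attribute values are hypothetical, and Laplace smoothing is omitted for brevity):

```python
from collections import Counter, defaultdict

def fit_nb(X, y):
    """Estimate P(y) and P(A_j | y) as frequency counts over the training set."""
    class_counts = Counter(y)
    cond = defaultdict(Counter)  # (class, attribute index) -> value counts
    for xi, yi in zip(X, y):
        for j, v in enumerate(xi):
            cond[(yi, j)][v] += 1
    return class_counts, cond, len(y)

def predict_nb(x, class_counts, cond, n):
    """Return the class maximizing P(y) * prod_j P(A_j | y).

    P(A_1, ..., A_n) is the same for every class, so it can be dropped.
    """
    best, best_score = None, -1.0
    for c, cc in class_counts.items():
        score = cc / n                          # prior P(y)
        for j, v in enumerate(x):
            score *= cond[(c, j)][v] / cc       # likelihood P(A_j = v | y)
        if score > best_score:
            best, best_score = c, score
    return best
```

Because the denominator P(A_1, ..., A_n) does not depend on the class, the prediction only compares the numerators.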
MultiBoost with NN Classifier
The Nearest Neighbor (NN) pattern classifier has been shown to be a powerful tool for multi-class classification. The basic idea of the NN classifier is that whenever we have a new instance to classify, we find its k nearest neighbors in the training data. Given a query instance x_q to be classified, let x_1, x_2, ..., x_k denote the k training instances nearest to x_q; return the class that is most frequent among these k instances.
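This majority-vote rule can be sketched in a few lines of Python; the Euclidean metric and the toy training points below are illustrative assumptions:

```python
import math
from collections import Counter

def knn_predict(x_q, X_train, y_train, k=3):
    """Classify x_q by majority vote among its k nearest training instances."""
    # Sort training indices by Euclidean distance to the query point.
    order = sorted(range(len(X_train)), key=lambda i: math.dist(x_q, X_train[i]))
    votes = Counter(y_train[i] for i in order[:k])
    return votes.most_common(1)[0][0]
```

For large training sets, the linear scan here would typically be replaced by a spatial index such as a k-d tree, but the decision rule is the same.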