Fig. 1.2 Decision boundaries established by different classifiers
about the class-conditional densities. Figure 1.3 shows a summary of the statistical classifiers that can be found in the literature [8], highlighting the types of classifiers that are proposed in this work. Statistical classifiers can be summarized
in three categories: (i) based on the concept of similarity—defining an appropriate
distance metric; (ii) based on the probabilistic approach; the optimal Bayes
decision rule (with 0/1 loss function) assigns a pattern to the class with the
maximum posterior probability; and (iii) based on the construction of decision
boundaries (geometric approach) directly by optimizing certain error criterion.
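As a minimal illustration of category (i), the similarity-based approach, the following sketch implements a nearest-centroid classifier with a Euclidean distance metric. The centroids and test points are hypothetical values chosen only for demonstration, not data from this work.

```python
import numpy as np

# Hypothetical class centroids (illustrative values only).
centroids = {"A": np.array([0.0, 0.0]), "B": np.array([3.0, 3.0])}

def nearest_centroid(x):
    # Assign x to the class whose centroid is closest in Euclidean distance.
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(nearest_centroid(np.array([0.5, 0.2])))  # "A"
print(nearest_centroid(np.array([2.8, 3.1])))  # "B"
```

Any other distance metric (e.g., Mahalanobis) could be substituted in place of the Euclidean norm; the choice of metric is what defines this family of classifiers.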
There are a number of decision rules available to define the decision bound-
aries, for instance, Bayes decision, maximum likelihood, and Neyman-Pearson.
The decision rule that most statistical classifiers, including the ones proposed in this work, attempt to implement is the Bayes decision rule. The "optimal" Bayes decision rule minimizes the conditional risk R(C_i | x) of assigning input data x to class C_i. Thus,

R(C_i | x) = \sum_{j=1}^{K} P(C_j | x) L(C_i, C_j)    (1.1)
where L(C_i, C_j) is the loss incurred in deciding C_i when the true class is C_j, and P(C_j | x) is the posterior probability [36]. Assuming a 0/1 loss function, i.e., L(C_i, C_j) = 0 if i = j and L(C_i, C_j) = 1 if i \neq j, the conditional risk becomes the conditional probability of misclassification, and thus the objective is to minimize the probability of classification error. In this case, the Bayes decision rule is called the maximum a posteriori (MAP) rule and can be defined as follows: assign input data x to class C_i if
P(C_i | x) > P(C_j | x) for all j \neq i    (1.2)