the FDT in their applications, and their results showed that this algorithm is suitable for our approach. There are two important points in building and applying an FDT:
Selecting the best attribute at each node to grow the tree: there are many criteria for this purpose, but we will use one of them.
The inference procedure of the FDT: when classifying a new sample with an FDT, we may reach several leaf nodes with different confidences, each proposing a class for the sample. A suitable selection mechanism is therefore important here.
Before we state the algorithm, we introduce some assumptions and notation:
The training examples form a set $E$ containing $n$ examples. Each example has $N$ attributes, each attribute $A_j$ contains $m_j$ linguistic terms, and the output classes are as follows:

$E = \{e_1, e_2, \ldots, e_n\}$

$A = \{A_1, A_2, \ldots, A_N\}$ for $A_j$, $1 \le j \le N$

$C = \{c_1, c_2, \ldots, c_k\}$

Fuzzy terms for $A_j$: $\{V_{j1}, V_{j2}, \ldots, V_{jm_j}\}$
The set of examples present at node $t$ is denoted by $X$.
$\mu_{c_k}(x)$: the degree of membership of example $x$ in class $c_k$.
$\mu_{v_{jL}}(x)$: the degree of membership of the crisp value of attribute $j$ in example $x$ in fuzzy term $L$ of attribute $j$. Also consider the four following formulas:
$P^{*}(c_k) = \sum_{x \in X} \mu_{c_k}(x)$  (1)

$P^{*}(v_L) = \sum_{x \in X} \mu_{v_L}(x)$  (2)

$P^{*}(c_k \mid v_L) = P^{*}(c_k \cap v_L) \,/\, P^{*}(v_L)$  (3)

$P^{*}(c_k \cap v_L) = \sum_{x \in X} \mu_{c_k}(x)\, \mu_{v_L}(x)$  (4)
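As an illustration, the four quantities (1)-(4) can be computed directly from membership-degree arrays. The following is a minimal sketch, assuming NumPy arrays whose names, shapes, and sample values are our own and not part of the original text:

```python
import numpy as np

# mu_class[i, k]: membership of example x_i (in node set X) in class c_k
# mu_term[i]:     membership of the attribute value of x_i in one fuzzy term v_L
# (illustrative values; any non-negative membership degrees would do)
mu_class = np.array([[0.9, 0.1],
                     [0.4, 0.6],
                     [0.7, 0.3]])
mu_term = np.array([0.8, 0.5, 0.2])

p_class = mu_class.sum(axis=0)                        # (1) P*(c_k) = sum_x mu_ck(x)
p_term = mu_term.sum()                                # (2) P*(v_L) = sum_x mu_vL(x)
p_joint = (mu_class * mu_term[:, None]).sum(axis=0)   # (4) P*(c_k and v_L)
p_cond = p_joint / p_term                             # (3) P*(c_k | v_L)

print(p_class, p_term, p_joint, p_cond)
```

Note that the conditional quantity (3) is obtained from (4) and (2), so the joint sum is computed before dividing by the fuzzy-term total.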
Creating a Fuzzy Decision Tree.
Step 1: Start with all the training examples in the root node, each carrying its initial weight, i.e., its degree of membership in the desired class. In other words, all training examples are used with their initial weights (this initial weight is often taken to be 1, but is not necessarily 1).
Step 2: If, at a node $t$ with fuzzy set $X$, one of the conditions below holds, that node is considered a leaf node.
Con1: for some class $c_k$, the ratio of the sum of the membership degrees of the examples in $X$ for that class to the sum of the membership degrees of all examples in $X$ over all classes is greater than or equal to $\theta_r$:

$\left[ \sum_{x \in X} \mu_{c_k}(x) \Big/ \sum_{k=1}^{K} \sum_{x \in X} \mu_{c_k}(x) \right] \ge \theta_r$  (5)

Con2: the sum of the membership degrees of all examples in set $X$ is less than a threshold $\theta$:

$\sum_{k=1}^{K} \sum_{x \in X} \mu_{c_k}(x) < \theta$  (6)
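The two leaf conditions of Step 2 can be sketched as a single test over the membership matrix of the examples at a node. This is a hypothetical sketch assuming NumPy; the function name, the matrix layout, and the threshold parameter names (`theta_r`, `theta_min`) are our own:

```python
import numpy as np

def is_leaf(mu_class, theta_r, theta_min):
    """Leaf test for a node, per conditions (5) and (6).

    mu_class[i, k]: membership of example x_i (in the node's set X) in class c_k.
    """
    total = mu_class.sum()            # sum over all examples and all classes
    if total < theta_min:             # Con2, eq. (6): too little mass left
        return True
    per_class = mu_class.sum(axis=0)  # per-class membership totals
    # Con1, eq. (5): some class holds at least theta_r of the total mass
    return bool((per_class / total).max() >= theta_r)

mu = np.array([[0.9, 0.1],
               [0.8, 0.2]])
print(is_leaf(mu, theta_r=0.8, theta_min=0.1))  # True: one class dominates
```

If neither condition holds, the node would be split further by selecting the best attribute, as described earlier.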