17. For j = 1 to Interval do
18. { use algorithm AI(i) to generate a classifier on example set EI(j)
19.   and generate a data term according to Definition 7.19;
20.   store the data term into CB; }
21. Else if op = 4 // deleting an old registered algorithm
22. then Delete the related item of RA from CB;
23. Else if op = 5 // deleting an old registered example set
24. then Delete the related item of ES from CB;
25. Else return failed;
26. Modify Interval according to the size of CEB.
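The following is a minimal Python sketch of the branch structure in steps 17-26 above. The case base CB is modelled as a plain list of dictionary "data terms", the earlier op branches from the preceding page are collapsed into an assumed op = 3 case, and all names (maintain_case_base, ai, example_sets) as well as the Interval adjustment rule are illustrative assumptions rather than the book's own data structures.

def maintain_case_base(op, cb, ceb, ai, example_sets, interval, ra=None, es=None):
    # Hedged sketch only; CB is a list of data-term dicts, CEB a list of examples.
    if op == 3:                                    # assumed: register a new algorithm
        for j in range(interval):                  # steps 17-20
            clf = ai(example_sets[j])              # algorithm AI(i) applied to example set EI(j)
            cb.append({"algorithm": ai.__name__,   # data term built per Definition 7.19
                       "example_set": j,
                       "classifier": clf})
    elif op == 4:                                  # steps 21-22: delete an old registered algorithm
        cb[:] = [term for term in cb if term["algorithm"] != ra]
    elif op == 5:                                  # steps 23-24: delete an old registered example set
        cb[:] = [term for term in cb if term["example_set"] != es]
    else:                                          # step 25
        return "failed", interval
    interval = max(1, len(ceb) // 10)              # step 26: adjust Interval by the size of CEB (assumed rule)
    return "ok", interval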
7.8.8 Improved decision tree generating algorithm GSD
We use a cost coefficient C_c and a bias coefficient C_b for constructing the attribute selecting function ASF, and set default values for both. The following gives the construction method of the attribute selecting function ASF.
Definition 7.28 E_0 = {e_1, …, e_υ} is the set of primitive training examples and A_0 = {A_1, …, A_ϖ} is the attribute set of E_0, where υ is the maximum number of primitive training examples and ϖ is the maximum number of attributes of E_0; A_i = {V_1, …, V_v} represents the v different values of attribute i.
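As a concrete illustration of Definition 7.28, the primitive training data can be held in very simple structures; the dictionary layout and the attribute names in the Python sketch below are assumptions for illustration only, not part of the book's definition.

# Assumed in-memory layout for Definition 7.28 (illustration only).
E0 = [                                    # primitive training examples e_1, ..., e_υ
    {"Outlook": "sunny", "Windy": "false", "class": "N"},
    {"Outlook": "rain",  "Windy": "true",  "class": "P"},
]
A0 = ["Outlook", "Windy"]                 # attribute set of E0
# A_i = {V_1, ..., V_v}: the v different values taken by attribute i
values_of = {a: sorted({e[a] for e in E0}) for a in A0}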
Definition 7.29 E = {e_1, …, e_n} and A = A^X ∪ A^T = {A_1, …, A_m, A_{m+1}}, with A^X = {A_1, …, A_m}, are the training example set and attribute set obtained through preprocessing that introduces background knowledge and algorithm CGAOI, where 1 ≤ n ≤ υ, 2 ≤ m ≤ ϖ.

Definition 7.30 A^T = C^X = [C_1, …, C_n]^T is the propagation frequency obtained from algorithm CGAOI.
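The effect of the preprocessing in Definitions 7.29 and 7.30 can be pictured as appending one extra attribute column that stores the propagation frequency produced by algorithm CGAOI. The Python sketch below only illustrates that idea; CGAOI itself is not reproduced in this section, and the function name is hypothetical.

def attach_propagation_frequency(E, C):
    # C = [C_1, ..., C_n]^T: propagation frequency vector returned by CGAOI (assumed given)
    assert len(E) == len(C)
    # each preprocessed example keeps A_1..A_m and gains the extra attribute A_{m+1}
    return [dict(e, propagation_frequency=c) for e, c in zip(E, C)]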
Definition 7.31 C = {C_1, …, C_k} is the set of k possible classes in the primitive training example set E; P_i represents the probability of class C_i in E.
Definition 7.32 In the procedure of generating the decision tree, each attribute-oriented test must maximize the value of the attribute selecting function ASF.
Definition 7.33 The ASF function is defined as follows:

ASF(A_i) = f(A_i) · g(A_i) / h(A_i)
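A hedged Python sketch of how Definitions 7.31-7.33 fit together: compute ASF for every candidate attribute and split on the one that maximizes it (Definition 7.32). The concrete choices of f, g and h below (an information-gain term, a bias term driven by C_b, and a cost term driven by C_c) are assumptions for illustration; this section only states the quotient form of ASF.

import math

def class_probabilities(E):                        # P_i of Definition 7.31
    counts = {}
    for e in E:
        counts[e["class"]] = counts.get(e["class"], 0) + 1
    return [c / len(E) for c in counts.values()]

def f(E, a):                                       # assumed: information-gain-style term
    def entropy(S):
        return -sum(p * math.log2(p) for p in class_probabilities(S) if p > 0)
    parts = {}
    for e in E:
        parts.setdefault(e[a], []).append(e)
    return entropy(E) - sum(len(S) / len(E) * entropy(S) for S in parts.values())

def g(E, a, C_b=1.0):                              # assumed: bias term using bias coefficient C_b
    return C_b

def h(E, a, cost, C_c=1.0):                        # assumed: cost term using cost coefficient C_c
    return 1.0 + C_c * cost.get(a, 0.0)

def ASF(E, a, cost):                               # Definition 7.33
    return f(E, a) * g(E, a) / h(E, a, cost)

def best_attribute(E, attrs, cost):                # Definition 7.32: choose the attribute maximizing ASF
    return max(attrs, key=lambda a: ASF(E, a, cost))

# Example (using the assumed E0/A0 layout from the sketch after Definition 7.28):
# best_attribute(E0, A0, cost={"Outlook": 0.2})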
 