Algorithm 9 LVF algorithm.
function LVF(D - a data set with M features, U - the inconsistency rate, maxTries - stopping criterion, γ - an allowed inconsistency rate)
    initialize: list L = {}                      ▷ L stores equally good sets
    C_best = M
    for maxTries iterations do
        S = randomSet(seed)
        C = #(S)                                 ▷ # - the cardinality of S
        if C < C_best and CalU(S, D) < γ then
            S_best = S
            C_best = C
            L = {S}                              ▷ L is reinitialized
        else if C = C_best and CalU(S, D) < γ then
            L = append(S, L)
        end if
    end for
    return L                                     ▷ all equivalently good subsets found by LVF
end function
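As a sketch, the LVF loop above can be written in Python. The `inconsistency_rate` function below plays the role of CalU, using the usual definition of the inconsistency rate (instances that agree on the selected features but carry different class labels); the random-subset generation replaces the pseudocode's `randomSet(seed)` helper. Function names and the toy data are illustrative, not from the source.

```python
import random
from collections import Counter

def inconsistency_rate(data, labels, subset):
    """CalU: fraction of instances whose feature values on `subset`
    match another instance of a different class."""
    groups = {}
    for row, y in zip(data, labels):
        key = tuple(row[i] for i in subset)
        groups.setdefault(key, []).append(y)
    # Each group's inconsistency count: group size minus the majority-class count.
    inconsistent = sum(len(ys) - max(Counter(ys).values())
                       for ys in groups.values())
    return inconsistent / len(data)

def lvf(data, labels, max_tries, gamma, seed=0):
    """Las Vegas Filter: random search for the smallest subset whose
    inconsistency rate stays below gamma."""
    rng = random.Random(seed)
    m = len(data[0])
    c_best = m                     # best cardinality so far (start at M)
    best_sets = []                 # L: equally good subsets
    for _ in range(max_tries):
        k = rng.randint(1, m)
        s = tuple(sorted(rng.sample(range(m), k)))  # S = randomSet(seed)
        c = len(s)                                  # C = #(S)
        if c < c_best and inconsistency_rate(data, labels, s) < gamma:
            c_best = c
            best_sets = [s]        # L is reinitialized
        elif (c == c_best and s not in best_sets
              and inconsistency_rate(data, labels, s) < gamma):
            best_sets.append(s)    # L = append(S, L)
    return best_sets
```

On a toy data set where feature 0 determines the class and feature 1 is noise, the search settles on the singleton `(0,)`, the smallest consistent subset.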
LVW replaces the function CalU() in LVF (see Algorithm 10). Probabilistic measures (category C16) can also be used in LVF.
Algorithm 10 LVW algorithm.
function LVW(D - a data set with M features, LA - a learning algorithm, maxTries - stopping criterion, F - a full set of features)
    initialize: list L = {}                      ▷ L stores sets with equal accuracy
    A_best = estimate(D, F, LA)
    for maxTries iterations do
        S = randomSet(seed)
        A = estimate(D, S, LA)
        if A > A_best then
            S_best = S
            A_best = A
            L = {S}                              ▷ L is reinitialized
        else if A = A_best then
            L = append(S, L)
        end if
    end for
    return L                                     ▷ all equivalently good subsets found by LVW
end function
7.4.4 Feature Weighting Methods
This is a variation of FS and it is closely related to work described in Chap. 8 regarding IS, lazy learning [1] and similarity measures. The Relief algorithm