Differently from (7.1) of RSM, we do not always have LA_A(X) ⊆ X and UA_A(X) ⊇ X. However, we have

LA_A(X) ⊆ UA_A(X), (7.10)

because 1 − β > β when β < 0.5. Moreover, we also have
LA_A(X) ∩ LA_A(X′) = ∅ (7.11)

for any disjoint subsets X, X′ ⊆ U (X ∩ X′ = ∅), because β < 0.5. Because of the inclusion relation (7.10), each of the lower and upper approximations and the boundary is represented by the other two sets:
UA_A(X) = LA_A(X) ∪ BN_A(X),
LA_A(X) = UA_A(X) \ BN_A(X).
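As a concrete illustration, the β-approximations can be computed from the rough membership function μ_X(u) = |R_A(u) ∩ X| / |R_A(u)|. The following sketch checks the inclusion (7.10) and the two representation identities; the universe, the granules (equivalence classes), the target set X, and β = 0.3 are hypothetical choices, not data from the book.

```python
# Sketch of VPRSM approximations (hypothetical data, beta = 0.3 assumed).
from itertools import chain

beta = 0.3
# Hypothetical partition of U = {0,...,7} into equivalence classes R_A(u).
granules = [{0, 1}, {2, 3, 4}, {5}, {6, 7}]
X = {0, 1, 2, 6}  # hypothetical target concept

def mu(granule, X):
    """Rough membership: fraction of the granule that lies in X."""
    return len(granule & X) / len(granule)

# Lower/upper approximations and boundary under precision beta.
LA = set(chain.from_iterable(g for g in granules if mu(g, X) >= 1 - beta))
UA = set(chain.from_iterable(g for g in granules if mu(g, X) >= beta))
BN = UA - LA

print(LA <= UA)       # inclusion (7.10)       -> True
print(UA == LA | BN)  # UA_A(X) = LA_A(X) ∪ BN_A(X) -> True
print(LA == UA - BN)  # LA_A(X) = UA_A(X) \ BN_A(X) -> True
```

Note that, unlike in classical RSM, LA here is not a subset of X: the granule {0, 1} could contain objects outside X and still be accepted once its membership reaches 1 − β.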
The monotonic property (7.4) does not hold either. This causes difficulties in defining and enumerating reducts in VPRSM.
We can define positive, boundary, and negative regions in the same manner as in the classical RSM:
POS_A(X) = ⋃{R_A(u) | μ_X(u) ≥ 1 − β},
BND_A(X) = ⋃{R_A(u) | μ_X(u) ∈ [β, 1 − β)},
NEG_A(X) = ⋃{R_A(u) | μ_X(u) < β}.
Clearly, we have

POS_A(X) = LA_A(X),
BND_A(X) = BN_A(X),
NEG_A(X) = U \ UA_A(X).
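These region definitions and their correspondence with the approximations can be verified directly on a small example. A minimal sketch, again with hypothetical granules, target set X, and β = 0.3 (none of which come from the book):

```python
# Sketch of VPRSM regions (hypothetical data, beta = 0.3 assumed).
from itertools import chain

beta = 0.3
granules = [{0, 1}, {2, 3, 4}, {5}, {6, 7}]  # hypothetical partition of U
U = set().union(*granules)
X = {0, 1, 2, 6}                             # hypothetical concept

def mu(g, X):
    """Rough membership of granule g in X."""
    return len(g & X) / len(g)

def union(gs):
    """Union of a family of granules into one subset of U."""
    return set(chain.from_iterable(gs))

POS = union(g for g in granules if mu(g, X) >= 1 - beta)
BND = union(g for g in granules if beta <= mu(g, X) < 1 - beta)
NEG = union(g for g in granules if mu(g, X) < beta)

# Correspondence with the lower and upper approximations.
LA = union(g for g in granules if mu(g, X) >= 1 - beta)
UA = union(g for g in granules if mu(g, X) >= beta)
print(LA == POS, UA - LA == BND, UA == U - NEG)  # -> True True True
```

Since the membership thresholds ≥ 1 − β, [β, 1 − β), and < β partition [0, 1], the three regions partition U, exactly as in the classical model.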
In the rest of this section, we consider VPRSM under a decision table D = (U, C ∪ {d}, {V_a}). For each decision attribute value i ∈ V_d, the decision class X_i = {u ∈ U | d(u) = i}. The set of all decision classes is denoted by X = {X_1, X_2, ..., X_p}.
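The construction of decision classes is simply a grouping of objects by their decision value. A short sketch; the objects and decision values below are hypothetical, not those of any table in the book:

```python
# Decision classes X_i = {u in U | d(u) = i} from a hypothetical decision table.
from collections import defaultdict

# Hypothetical decision attribute: object u -> d(u).
d = {"u1": "bad", "u2": "good", "u3": "medium", "u4": "bad", "u5": "good"}

classes = defaultdict(set)
for u, value in d.items():
    classes[value].add(u)  # u joins the decision class of its value d(u)

print(sorted(classes))     # -> ['bad', 'good', 'medium']
print(classes["bad"])      # the class of objects with d(u) = bad
```

By construction the decision classes are pairwise disjoint and cover U, so they form a partition of the universe.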
Example 7 Consider a decision table D = (U, C ∪ {d}, {V_a}) given in Table 7.3. The decision table is composed of 40 objects with a condition attribute set C = {c_1, c_2, c_3, c_4} and a decision attribute d. Each condition attribute takes a value bad or good, i.e., V_{c_i} = {bad, good} for i = 1, 2, 3, 4. The decision attribute takes one of three values: V_d = {bad, medium, good}. Then there are three decision classes X_b, X_m, and X_g whose objects take decision attribute values bad, medium, and good, respectively.