Step 1 Let $Z = (z_{ij})_{n \times n}$ be an intuitionistic fuzzy similarity matrix, where $z_{ij} = (\mu_{ij}, v_{ij})$ $(i, j = 1, 2, \ldots, n)$ are IFVs. Then we select one of the elements of $Z$ to determine the confidence level $\lambda_1$, which obeys the following principles:
(1) Rank the membership degrees of $z_{ij}$ $(i, j = 1, 2, \ldots, n)$ in descending order, and then take $\lambda_1 = (\mu_{\lambda_1}, v_{\lambda_1}) = (\mu_{i_1 j_1}, v_{i_1 j_1})$, where $\mu_{i_1 j_1} = \max_{i,j}\{\mu_{ij}\}$.
(2) If there exist two IFVs $(\mu_{i_1 j_1}, v_{i_1 j_1})$ and $(\mu_{i_1' j_1'}, v_{i_1' j_1'})$ in (1) such that $v_{i_1 j_1} \ne v_{i_1' j_1'}$ (without loss of generality, let $v_{i_1 j_1} < v_{i_1' j_1'}$), then we choose the first one as $\lambda_1$, i.e., $\lambda_1 = (\mu_{i_1 j_1}, v_{i_1 j_1})$.

Then, for each alternative $y_i$, we let
$$[y_i]_Z^{(1)} = \{\, y_j \mid z_{ij} = \lambda_1 \,\} \qquad (2.211)$$
In this case, $y_i$ and all of the alternatives in $[y_i]_Z^{(1)}$ are clustered into one type, and each of the other alternatives is clustered into one type.
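As a rough illustration of Step 1, the sketch below (illustrative names and toy data only, not the authors' code) ranks the entries of an intuitionistic fuzzy similarity matrix by membership degree, breaks ties by the smaller non-membership degree, takes the top entry as $\lambda_1$, and collects $[y_i]_Z^{(1)}$:

```python
# Minimal sketch of Step 1 (assumed names, toy data): choose lambda_1 as the
# entry of Z with the largest membership degree, breaking ties by the smaller
# non-membership degree, and collect [y_i]^(1)_Z = { y_j : z_ij = lambda_1 }.

def ranked_levels(Z):
    """Distinct IFVs of Z, largest membership first, ties broken by smaller non-membership."""
    values = {z for row in Z for z in row}
    return sorted(values, key=lambda mv: (-mv[0], mv[1]))

def cluster_at(Z, i, lam):
    """Indices j of the alternatives y_j whose z_ij equals the confidence level lam."""
    return {j for j, z in enumerate(Z[i]) if z == lam}

# toy 3x3 intuitionistic fuzzy similarity matrix of (mu, v) pairs
Z = [
    [(1.0, 0.0), (0.7, 0.2), (0.4, 0.5)],
    [(0.7, 0.2), (1.0, 0.0), (0.5, 0.4)],
    [(0.4, 0.5), (0.5, 0.4), (1.0, 0.0)],
]
lam1 = ranked_levels(Z)[0]      # (1.0, 0.0): the diagonal self-similarities
print(cluster_at(Z, 0, lam1))   # {0}: at lambda_1 each alternative forms its own type
```

With the toy matrix, $\lambda_1$ is the diagonal value $(1.0, 0.0)$, so at this first level every alternative forms its own type; the later confidence levels are what Steps 2 and 3 use to merge types.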
Step 2 Choose the confidence level $\lambda_2$ such that $\lambda_2 = (\mu_{\lambda_2}, v_{\lambda_2}) = (\mu_{i_2 j_2}, v_{i_2 j_2})$, with $\mu_{i_2 j_2} = \max_{(i,j) \ne (i_1, j_1)}\{\mu_{ij}\}$ (in particular, if there exist two or more IFVs whose membership degrees have the same value $\mu_{i_2 j_2}$, then we can follow the policy in (2) of Step 1). Then we let $[y_i]_Z^{(2)} = \{\, y_j \mid z_{ij} = \lambda_2 \,\}$; in this case, $y_i$ and all of the alternatives in $[y_i]_Z^{(2)}$ are clustered into one type, and each of the other alternatives is clustered into one type. Merging $[y_i]_Z^{(1)}$ and $[y_i]_Z^{(2)}$, we get $[y_i]_Z^{(1,2)} = \{\, y_j \mid z_{ij} \in \{\lambda_1, \lambda_2\} \,\}$, and thus $y_i$ and all of the alternatives in $[y_i]_Z^{(1,2)}$ are clustered into one type, while the types of the other alternatives remain unchanged.
Step 3 Take the other confidence levels and carry out the cluster analysis following the procedure of Step 2, until all the alternatives are clustered into one type.
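Putting Steps 1–3 together, a sketch of the whole direct procedure might look like the following; `direct_clustering` and the union–find merging are illustrative choices (not taken from the source), and `ranked_levels` and the toy matrix `Z` come from the previous sketch:

```python
# Sketch of Steps 1-3 (assumed names): walk through the confidence levels in
# ranked order and, at each level, put y_j into the same type as y_i whenever
# z_ij equals the current level, stopping once all alternatives are one type.

def direct_clustering(Z):
    n = len(Z)
    parent = list(range(n))                 # union-find over the alternatives

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):
        parent[find(x)] = find(y)

    stages = []
    for lam in ranked_levels(Z):            # lambda_1, lambda_2, ... as in Step 1
        for i in range(n):
            for j in range(n):
                if Z[i][j] == lam:          # y_j lies in [y_i]^(.)_Z at this level
                    union(i, j)
        classes = {}
        for i in range(n):
            classes.setdefault(find(i), set()).add(i)
        stages.append((lam, sorted(sorted(c) for c in classes.values())))
        if len(classes) == 1:               # Step 3: everything merged into one type
            break
    return stages

for lam, types in direct_clustering(Z):
    print(lam, types)                       # partition of the alternatives per level
```

For the toy matrix this prints three stages: every alternative alone at $(1.0, 0.0)$, the first two merged at $(0.7, 0.2)$, and all of them merged at $(0.5, 0.4)$, which mirrors how the classes $[y_i]_Z^{(1,2,\ldots)}$ grow until a single type remains.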
From the above procedure, we can see that the direct method performs the cluster analysis based only on the subscripts of the alternatives, and there is not even a need to compute the $\lambda$-cutting matrix, which is a notable advantage of the direct method. In practical applications, after choosing some proper confidence levels, we only need to locate them in the intuitionistic fuzzy similarity matrix, and then we can obtain the types of the considered objects from their location subscripts.
Example 2.16 (Wang et al. 2012) We use the same example as Example 2.15, and
utilize the direct method developed above to classify the five cars, which involves
the following steps:
Step 1 By Eq. (2.208), we calculate
$$\operatorname{sim}(y_1, y_2) = (Z_1 \to Z_2)_{12} = \min_{1 \le k \le 6}\Bigl(\mu_{\min(z_{1k} \to z_{2k},\, z_{2k} \to z_{1k})},\ v_{\min(z_{1k} \to z_{2k},\, z_{2k} \to z_{1k})}\Bigr)$$
and get $\operatorname{sim}(y_1, y_2) = (0.7, 0.2)$.
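Eq. (2.208) itself is not reproduced in this excerpt, so the following is only a structural sketch under explicit assumptions: `compose` is a placeholder for whatever operation Eq. (2.208) actually applies to the pair $(z_{1k}, z_{2k})$, and `ifv_min` is an assumed ordering of IFVs (smaller membership first, ties broken by larger non-membership). Only the overall shape, the minimum over the attributes of the smaller of the two composed values, is taken from the formula above.

```python
# Structural sketch only (assumed names): Eq. (2.208) is not shown here, so
# `compose` stands in for the operation actually applied to (z_1k, z_2k), and
# `ifv_min` is an assumed way of taking the smaller of two IFVs (mu, v).

def ifv_min(a, b):
    """Assumed ordering of IFVs: smaller membership first, ties to larger non-membership."""
    return min(a, b, key=lambda mv: (mv[0], -mv[1]))

def similarity(row1, row2, compose):
    """min over k of min(compose(z_1k, z_2k), compose(z_2k, z_1k))."""
    terms = [ifv_min(compose(a, b), compose(b, a)) for a, b in zip(row1, row2)]
    result = terms[0]
    for t in terms[1:]:
        result = ifv_min(result, t)
    return result

# purely illustrative stand-in for the real operation of Eq. (2.208),
# applied to two toy attributes instead of the six in the example:
demo_compose = lambda a, b: (min(a[0], b[0]), max(a[1], b[1]))
print(similarity([(0.8, 0.1), (0.6, 0.3)], [(0.7, 0.2), (0.9, 0.1)], demo_compose))
```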
Then we calculate the others in a similar way. Consequently, we get the intuitionistic fuzzy similarity matrix: