From these orders, all objects that are not included in both orders are eliminated. The
generated orders become:
\[
  O_1 = x_3 \succ x_4 \succ x_6, \qquad O_2 = x_4 \succ x_3 \succ x_6 .
\]
The ranks of objects in these orders are:
\[
  r(O_1, x_3) = 1,\; r(O_1, x_4) = 2,\; r(O_1, x_6) = 3; \qquad
  r(O_2, x_3) = 2,\; r(O_2, x_4) = 1,\; r(O_2, x_6) = 3 .
\]
Consequently, Spearman's $\rho$ becomes
\[
  \rho = 1 - \frac{6\left\{ (1-2)^2 + (2-1)^2 + (3-3)^2 \right\}}{3^3 - 3} = 0.5 .
\]
If no common objects exist between the two orders, $\rho = 0$ (i.e., no correlation).
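To make this computation concrete, the following Python sketch restricts two orders to their common objects, ranks the objects within each restricted order, and applies $\rho = 1 - 6 \sum_i d_i^2 / (n^3 - n)$. It is only an illustration, not code from [8]; the function name and the handling of the degenerate cases (no common objects, a single common object) are our own assumptions.

def spearman_rho(order_a, order_b):
    """order_a and order_b are lists of object ids, most preferred first."""
    b_set = set(order_b)
    common = [x for x in order_a if x in b_set]
    if not common:
        return 0.0  # no common objects: rho = 0 (no correlation), as stated above
    common_set = set(common)
    # ranks restricted to the common objects, counted from 1
    rank_a = {x: r for r, x in enumerate([y for y in order_a if y in common_set], 1)}
    rank_b = {x: r for r, x in enumerate([y for y in order_b if y in common_set], 1)}
    n = len(common)
    if n == 1:
        return 0.0  # rho is undefined for a single object; returning 0 is an assumption
    d2 = sum((rank_a[x] - rank_b[x]) ** 2 for x in common)
    return 1.0 - 6.0 * d2 / (n ** 3 - n)

For the two orders above, spearman_rho(["x3", "x4", "x6"], ["x4", "x3", "x6"]) evaluates to 0.5, matching the hand calculation.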
15.3.2 k-o'means-TMSE (Thurstone Minimum Square Error)
In [8], we proposed the k-o'means algorithm as a clustering method designed to process orders. To differentiate it from our new algorithm, described in detail later, we call it the k-o'means-TMSE algorithm.
The k-o'means-TMSE algorithm in Figure 15.1 is similar to the well-known k-means algorithm [11]. Specifically, an initial partition into clusters is refined by iteratively estimating new cluster centers and re-assigning samples. This process is repeated until no change in the cluster assignment is detected or the pre-defined number of iterations is reached. However, different notions of dissimilarity and cluster centers are used so that orders can be handled. For the dissimilarity $d(O_k, O_i)$, equation (15.4) was used in step 4. As a cluster center in step 3, we used the following notion of a central order [4]. Given a set of
Algorithm k-o'means(S, K, maxIter)
S = {O_1, ..., O_N}: a set of orders
K: the number of clusters
maxIter: the limit on the number of iterations
1) S is randomly partitioned into a set of clusters: π = {C_1, ..., C_K}, π′ := π, t := 0.
2) t := t + 1; if t > maxIter, go to step 6.
3) for each cluster C_k ∈ π, derive the corresponding central order O_k.
4) for each order O_i in S, assign it to the cluster arg min_{C_k} d(O_k, O_i).
5) if π = π′ then go to step 6; else π′ := π, go to step 2.
6) output π.

Fig. 15.1. The k-o'means algorithm
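The following Python sketch mirrors the loop structure of Figure 15.1. It is an illustration only: equation (15.4) is not reproduced in this section, so the dissimilarity is stood in by 1 − ρ computed with the spearman_rho function sketched earlier, and the central order of step 3 is approximated by a simple mean-rank (Borda-style) aggregation rather than the Thurstone-based estimator that gives k-o'means-TMSE its name; all function and variable names are hypothetical.

import random

def central_order(orders):
    # Stand-in for step 3: sort the objects shared by every order in the
    # cluster by their mean rank (an assumption, not the TMSE estimator).
    common = set(orders[0])
    for o in orders[1:]:
        common &= set(o)
    mean_rank = {x: sum(o.index(x) for o in orders) / len(orders) for x in common}
    return sorted(common, key=lambda x: mean_rank[x])

def k_o_means(S, K, max_iter, seed=0):
    rng = random.Random(seed)
    assign = [rng.randrange(K) for _ in S]                # step 1: random partition
    for _ in range(max_iter):                             # step 2: iteration limit
        clusters = [[S[i] for i in range(len(S)) if assign[i] == k] for k in range(K)]
        centers = [central_order(c) if c else [] for c in clusters]   # step 3
        # step 4: re-assign each order to the nearest center, using 1 - rho
        new_assign = [min(range(K), key=lambda k: 1.0 - spearman_rho(centers[k], O))
                      for O in S]
        if new_assign == assign:                          # step 5: assignment is stable
            break
        assign = new_assign
    return assign                                         # step 6: final partition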
 