for point Q. It achieves this by comparing, from an entropic point of view, all PDFs resulting from adding each connection vector $q_i$ to the M-neighborhood vector field. LEGClust ranks each $q_i$ according to the variation it introduces into the respective (PDF) entropy. The connection that introduces the least disorder into the system, i.e., the one that least increases the entropy of the system, is top ranked as the ideal or strongest connection, followed by the other $M-1$ connections in decreasing order.
Let $D = \{d_{ij}\}$, $i, j = 1, 2, \ldots, M$, $i \neq j$. Let $H_{R_2}(D, q_i)$ be Rényi's quadratic entropy associated with connection $q_i$, the entropy of the set of all $d_{ij}$ connections plus connection $q_i$:
$$
H_{R_2}(D, q_i) = H(D \cup \{q_i\}), \quad i = 1, 2, \ldots, M. \qquad (6.51)
$$
This entropy is the dissimilarity measure for the LEGClust algorithm.
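To make the ranking rule concrete, the sketch below scores each of the $M$ candidate connections of a point by the Rényi quadratic entropy of the augmented set $D \cup \{q_i\}$ and sorts the candidates by increasing entropy. It is only an illustration: the Gaussian-kernel Parzen estimator used for $H_{R_2}$ anticipates the estimate developed below in (6.52), and the bandwidth $h$ and the use of the squared set cardinality as normalizing constant are assumptions here (any candidate-independent constant leaves the ranking unchanged).

```python
import numpy as np

def renyi_quadratic_entropy(vectors, h):
    """Gaussian-kernel Parzen estimate of Renyi's quadratic entropy,
    H_R2 = -log V_R2, of a set of d-dimensional vectors."""
    n, d = vectors.shape
    diff = vectors[:, None, :] - vectors[None, :, :]      # all pairwise differences
    sq_norms = np.sum(diff ** 2, axis=-1)                  # squared norms of the differences
    kernels = np.exp(-sq_norms / (2.0 * (2.0 * h ** 2)))   # pairwise Gaussian kernels
    v_r2 = kernels.sum() / ((2 * np.pi) ** (d / 2) * (2 * h ** 2) ** (d / 2) * n ** 2)
    return -np.log(v_r2)

def rank_connections(D, Q, h):
    """Rank the candidate connections Q[i] of a point by the entropy of
    D U {Q[i]} (eq. 6.51): the one adding the least disorder comes first."""
    entropies = np.array([
        renyi_quadratic_entropy(np.vstack([D, qi[None, :]]), h) for qi in Q
    ])
    return np.argsort(entropies), entropies    # ascending entropy
```

Here `D` is the $M(M-1) \times d$ array of neighborhood connection vectors $d_{ij}$ and `Q` the $M \times d$ array of candidate connections $q_i$; the first returned index points to the strongest connection and the remaining indices follow in decreasing order of strength.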
Let us apply the estimate expressed by formula (6.49) to (6.51). For that purpose we first need a simple and uniform notation for the connection vectors. We do this as follows: $d_p$, $p = 1, \ldots, M(M-1)$, will denote any of the $d_{ij}$ vectors of the M-neighborhood vector field, by setting $p = (i-1)(M-1) + j - (i<j)$ for all $i \neq j$; $d_0$ denotes any particular $q_i$ connection. Taking the information potential $V_{R_2} \equiv V_{R_2}(D \cup \{q_i\})$ (see Appendix F), we have, with $n = [M(M-1)]^2 + M(M-1)$,
$$
V_{R_2} = \frac{1}{(2\pi)^{d/2}(2h^2)^{d/2}\, n} \sum_{p=0}^{M(M-1)} \sum_{q=0}^{M(M-1)} e^{-\|d_p - d_q\|^2/(2(2h^2))}. \qquad (6.52)
$$
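The same quantity can also be evaluated literally as the double sum of (6.52), using the flat indexing $p = (i-1)(M-1) + j - (i<j)$ defined above. A minimal sketch, again assuming a Gaussian kernel of bandwidth $h$ and a candidate-independent normalizing constant (written here with the cardinality of the augmented set rather than the $n$ of the text, which does not affect the ranking):

```python
import numpy as np

def flat_index(i, j, M):
    """Map an ordered pair (i, j), i != j, 1 <= i, j <= M, onto p in 1..M(M-1)
    via p = (i-1)(M-1) + j - (i<j), as in the text."""
    return (i - 1) * (M - 1) + j - int(i < j)

def v_r2_double_sum(d, h):
    """Literal double sum of (6.52): d[0] is the candidate connection d_0 and
    d[1:], ordered by flat_index, are the d_ij of the M-neighborhood field."""
    d = np.asarray(d, dtype=float)
    n_vec, dim = d.shape                       # n_vec = M(M-1) + 1
    total = 0.0
    for p in range(n_vec):
        for q in range(n_vec):
            diff = d[p] - d[q]
            total += np.exp(-(diff @ diff) / (2.0 * (2.0 * h ** 2)))
    norm = (2.0 * np.pi) ** (dim / 2) * (2.0 * h ** 2) ** (dim / 2) * n_vec ** 2
    return total / norm
```

Stacking the candidate as $d_0$ followed by the $d_{ij}$ in the order given by `flat_index` reproduces, up to the assumed constant, the set-based computation of the previous sketch; taking $-\log$ of the result gives the entropy used for ranking.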
Interpreting the $d_p$ (or $d_q$) as errors, since they represent difference vectors (deviations) between points, one is then searching for the $d_0$ that minimizes Rényi's quadratic entropy of the errors. This bears some resemblance to the MEE approach for supervised classification. Note that any of the previously mentioned entropic clustering algorithms (Sect. 6.4.2.1) simply apply entropy to the data itself. LEGClust is so far the only algorithm that applies an MEE-like concept. The $V_{R_2}$ expression can be further processed as
$$
V_{R_2} = V_M + \frac{1}{(2\pi)^{d/2}(2h^2)^{d/2}\, n} \sum_{q=1}^{M(M-1)} e^{-\|d_0 - d_q\|^2/(2(2h^2))}, \qquad (6.53)
$$
where $V_M$ is the information potential relative to the $M(M-1)$ radially symmetric distribution of the $d_{ij}$. $V_M$ is constant for a given M-neighborhood vector field. What really matters is the rightmost term of (6.53), which is proportional to
$$
f_D(d_0) = \frac{1}{M(M-1)} \sum_{q=1}^{M(M-1)} e^{-\|d_0 - d_q\|^2/(2(2h^2))} \qquad (6.54)
$$
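Because $V_M$ and all normalizing constants in (6.53) are the same for every candidate, and $-\log$ is monotonically decreasing, ranking candidates by increasing entropy is equivalent to ranking them by decreasing $f_D(d_0)$: the preferred connection is the one whose vector lies in the densest region of the neighborhood vector field. A short sketch of this shortcut, with the same assumed Gaussian kernel and bandwidth $h$ as before:

```python
import numpy as np

def f_d(d0, D, h):
    """Score of a candidate connection d0 among the neighborhood connections D,
    following (6.54): an average of Gaussian kernels, i.e. a Parzen-like density."""
    sq_norms = np.sum((D - d0) ** 2, axis=1)          # ||d_0 - d_q||^2 for all q
    return np.mean(np.exp(-sq_norms / (2.0 * (2.0 * h ** 2))))

def rank_connections_by_density(D, Q, h):
    """Same ordering as ranking by the entropy of D U {q_i}: highest f_D first."""
    scores = np.array([f_d(qi, D, h) for qi in Q])
    return np.argsort(-scores), scores                # descending f_D
```

This reproduces the ordering returned by `rank_connections` above while evaluating only $M(M-1)$ kernels per candidate instead of the full double sum.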
 