If the solution is sparse enough, the above $\ell_0$ minimization problem can be bypassed by convexifying it into the following $\ell_1$ minimization problem:

$$\min_{w_i} \; \|w_i\|_1 \quad \text{s.t.} \quad x_i = X w_i \qquad (8)$$
Eq. (8) can be solved by standard linear programming. Hence, for each sample $x_i$ in $X$, the sparse reconstructive weight vector $w_i = [w_{i1}, \ldots, w_{i,i-1}, 0, w_{i,i+1}, \ldots, w_{in}]^T$ can be found. Therefore, the weighted affinity graph $W = [w_1, w_2, \ldots, w_n]^T$ can be composed from the solutions of $n$ standard linear programming problems. The weight matrix $W$ reflects some intrinsic geometric properties of the data. Figure 2(c) shows the affinity matrix of the partial ORL face dataset constructed by the sparse representation technique. It can be seen that most of the non-zero adjacency weights link samples from the same class. Besides, the affinity matrices constructed by KNN and b-matching are illustrated in Figure 2(a) and Figure 2(b), respectively. Here, the graph edges are re-weighted by a Gaussian kernel. The scale $\sigma$ of the Gaussian kernel is manually set to the median of the connected pairwise samples' distances, and $b = k = 3$. Clearly, compared with KNN, both b-matching and sparse representation construct better affinity graphs.
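To make this construction concrete, the following is a minimal sketch, not the authors' implementation, of composing the sparse affinity matrix W by solving Eq. (8) for each sample with an off-the-shelf LP solver. The function name sparse_affinity_graph, the use of scipy.optimize.linprog, and the standard split w_i = u - v (u, v >= 0) are assumptions; the exact equality constraint is kept as written in Eq. (8), although a noise-tolerant relaxation may be needed on real data.

import numpy as np
from scipy.optimize import linprog

def sparse_affinity_graph(X):
    # X: d x n data matrix whose columns are the samples x_1, ..., x_n.
    # For each x_i, Eq. (8) (min ||w_i||_1  s.t.  x_i = X w_i, with the
    # i-th weight fixed to 0) is recast as a standard LP via w_i = u - v.
    d, n = X.shape
    W = np.zeros((n, n))
    for i in range(n):
        idx = [j for j in range(n) if j != i]    # exclude x_i from its own basis
        Xi = X[:, idx]                           # d x (n - 1)
        c = np.ones(2 * (n - 1))                 # objective: sum(u) + sum(v) = ||w_i||_1
        A_eq = np.hstack([Xi, -Xi])              # Xi u - Xi v = x_i
        b_eq = X[:, i]
        res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (2 * (n - 1)), method="highs")
        if res.success:
            u, v = res.x[:n - 1], res.x[n - 1:]
            W[i, idx] = u - v                    # sparse reconstructive weights of w_i
    return W

Stacking the n weight vectors row by row mirrors W = [w_1, ..., w_n]^T above; in practice the result is often symmetrised, e.g. as (|W| + |W|^T)/2, before being used as a graph affinity.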
Fig. 2. Affinity matrices constructed by KNN, b-matching and $\ell_1$-minimization
3 The Algorithm
The algorithmic procedure of the improved KL clustering is summarized as follows.
STEP 1: Suppose a data set $X \in \mathbb{R}^{d \times n}$ is given.
(a) Construct the weighted graph matrix $W$ using the b-matching method, namely using Eq. (6) and the graph edge re-weighting method (a rough sketch of the re-weighting follows this step list);
(b) Construct the weighted graph matrix $W$ using the sparse representation method, namely using Eq. (8);
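As a rough illustration of the edge re-weighting in step (a), the sketch below applies a Gaussian kernel whose scale sigma is set to the median of the connected pairwise distances, with k = 3 as in the experiment above. A plain k-nearest-neighbour adjacency stands in for the b-matching graph of Eq. (6), which is not reproduced here, and the helper name knn_gaussian_affinity is hypothetical.

import numpy as np
from scipy.spatial.distance import cdist

def knn_gaussian_affinity(X, k=3):
    # X: d x n data matrix whose columns are the samples.
    # Build a k-NN adjacency (a stand-in for the b-matching graph) and
    # re-weight its edges by a Gaussian kernel with sigma equal to the
    # median distance over the connected pairs.
    n = X.shape[1]
    D = cdist(X.T, X.T)                      # n x n Euclidean distance matrix
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]     # k nearest neighbours, skipping self
        A[i, nbrs] = True
    A = A | A.T                              # symmetrise the adjacency
    sigma = np.median(D[A])                  # median of connected pairwise distances
    W = np.zeros((n, n))
    W[A] = np.exp(-D[A] ** 2 / (2 * sigma ** 2))
    return W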