where $P$ is a binary matrix and $S$ is the distance matrix obtained from $X$ as $S_{ij} = \|x_i - x_j\|_2$. $P_{ij} = 1$ indicates that an edge exists between points $x_i$ and $x_j$; otherwise the edge is absent. By enforcing the additional symmetry constraint $P_{ij} = P_{ji}$, an undirected sub-graph in which the degree of every node is strictly $b$ can be obtained. This property plays a key role when conducting label propagation on data sets that are unevenly and non-uniformly distributed. If the constraint $P_{ij} \in \{0,1\}$ is relaxed to $P_{ij} \in [0,1]$, this optimization problem can be solved by linear programming. In [20], a polynomial-time $O(bn^3)$ algorithm via loopy belief propagation is presented.
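To make the relaxed formulation concrete, the sketch below hands the $[0,1]$ relaxation to an off-the-shelf LP solver. This is a minimal illustration only, not the loopy-belief-propagation algorithm of [20]; the data, variable names, and the choice of `scipy.optimize.linprog` are our own assumptions. One variable is created per undirected edge $(i,j)$ with $i<j$, so the symmetry constraint $P_{ij}=P_{ji}$ is built in, and each node's degree is constrained to exactly $b$:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 2))          # 8 toy points in 2-D
S = squareform(pdist(X))                 # distance matrix S_ij = ||x_i - x_j||_2
n, b = len(X), 2

# One LP variable per undirected edge (i < j); P_ij = P_ji is implicit.
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
cost = np.array([S[i, j] for i, j in edges])

# Degree constraints: every node's total (relaxed) degree must equal b.
A_eq = np.zeros((n, len(edges)))
for k, (i, j) in enumerate(edges):
    A_eq[i, k] = A_eq[j, k] = 1.0
res = linprog(cost, A_eq=A_eq, b_eq=np.full(n, b), bounds=(0, 1))

# Assemble the symmetric (relaxed) matrix P; on many instances the LP
# solution is already integral, but that is not guaranteed in general.
P = np.zeros((n, n))
for k, (i, j) in enumerate(edges):
    P[i, j] = P[j, i] = res.x[k]
print(P.sum(axis=1))                     # relaxed degree of each node
```

Each row of `P` sums to $b$ by construction of the equality constraints; rounding near-integral values gives a binary sub-graph in practice.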
Graph Edge Re-weighting: Once the binary matrix $P$ is available, the weights of the graph edges can be computed by several approaches, such as the Gaussian kernel [16], 0-1 weighting, the $\ell$-norm, locally linear reconstruction weights [10], etc. Based on the definition of the b-matching graph, it can be seen that the new graph is sparser than the KNN graph. Figure 1 visualizes the neighborhood relations on a synthetic data set.
Fig. 1. (a) The synthetic dataset; (b) adjacency matrix constructed by the KNN method; (c) adjacency matrix constructed by the b-matching method. Notice: the green point in (a), denoted as point No. 4, has 7 neighbors in the KNN graph and 4 neighbors in the b-matching graph.
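As a concrete instance of the re-weighting step, the sketch below (our own minimal illustration, assuming a precomputed binary matrix `P` from b-matching or KNN) applies the Gaussian-kernel option: every retained edge receives the weight $\exp(-\|x_i - x_j\|^2 / 2\sigma^2)$, while absent edges stay at zero:

```python
import numpy as np

def gaussian_reweight(P, X, sigma=1.0):
    """Weight the edges of a binary adjacency matrix P with a Gaussian kernel."""
    # Pairwise squared Euclidean distances ||x_i - x_j||^2 via broadcasting.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    return P * W            # zero out edges not selected by the binary graph

# Toy example: three points, with point 0 connected to points 1 and 2.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
P = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]])
W = gaussian_reweight(P, X, sigma=1.0)
print(W)
```

Because `P` is symmetric and the kernel depends only on the pairwise distance, the resulting weight matrix `W` is symmetric as well, which is required for an undirected graph Laplacian.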
2.2.2
Graph Construction by Sparse Representation
The goal of SR (Sparse Representation) is to represent $x \in X$ using as few elements of $X$ as possible. This can be formally expressed as solving the following optimization problem:

$$\min_{w} \|w\|_0 \quad \text{s.t.} \quad x = Xw \qquad (7)$$

where $w \in \mathbb{R}^n$ is the coefficient vector and $\|\cdot\|_0$ denotes the $\ell_0$-norm, which counts the number of nonzero entries in a vector. However, finding the sparsest solution of Eq. (7) is NP-hard. Fortunately, recent reports [21] reveal that if the solution $w$ is