[Fig. 6.3: four 3-D scatter plots, panels (a)–(d), each showing Class 1, Class 2, and Class 3]
Fig. 6.3 A synthetic example showing the significance of learning a discriminative dictionary in feature space for classification. (a) Synthetic data consisting of linearly non-separable 3D points on a sphere. Different classes are represented by different colors. (b) Sparse coefficients from K-SVD projected onto learned SVM hyperplanes. (c) Sparse coefficients from a non-linear dictionary projected onto learned SVM hyperplanes. (d) Sparse coefficients from a non-linear discriminative kernel dictionary projected onto learned SVM hyperplanes [132]
Figure 6.3 presents an important comparison of the discriminative power obtained by learning a discriminative dictionary in the feature space, where a kernel-LDA-type discriminative term is included in the objective function. Scatter plots of the sparse coefficients obtained using the different approaches show that such a discriminative dictionary is able to learn the underlying non-linear sparsity of the data while also providing a more discriminative representation. See [132], [96], [95] for more details on the design of non-linear kernel dictionaries.
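To make the role of the discriminative term concrete, a schematic objective for this kind of kernel dictionary learning is sketched below. This is only an illustrative form, not the exact formulation of [132]: the dictionary is assumed to lie in the span of the mapped training data, D = Phi(Y)A, and the kernel-LDA-type penalty is written as a Fisher-style scatter term on the sparse codes; the symbols S_W, S_B, lambda, and T_0 are notational assumptions here.

% Schematic objective (illustrative sketch; see [132] for the actual model).
% Phi(Y) : training samples mapped into the feature space
% A      : atom coefficients, so the dictionary is D = Phi(Y) A
% X      : sparse codes (columns x_i), one per training sample
% S_W(X), S_B(X) : within-/between-class scatter matrices of the codes
% lambda, T_0    : trade-off weight and sparsity level (assumed names)
\min_{A,\,X} \; \bigl\| \Phi(Y) - \Phi(Y) A X \bigr\|_F^2
  + \lambda \Bigl( \operatorname{tr}\bigl(S_W(X)\bigr) - \operatorname{tr}\bigl(S_B(X)\bigr) \Bigr)
  \quad \text{subject to} \quad \| x_i \|_0 \le T_0 \;\; \text{for all } i .

Because the dictionary is parameterized through A, every inner product Phi(y_i)^T Phi(y_j) reduces to a kernel evaluation, and the reconstruction term can be computed entirely from the kernel matrix K(Y,Y) = Phi(Y)^T Phi(Y) as tr((I - AX)^T K(Y,Y)(I - AX)). This is what allows the non-linear structure of the data on the sphere in Fig. 6.3a to be captured without ever forming the feature map explicitly.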
 