explained by referring again to the examples in Fig. 4.1. Let us consider decision boundaries formed by a unique line, like line \ell in Fig. 4.1a. In these cases none of the features is redundant; however, it is apparent that the relevance of a feature can be stated in terms of the line slope. In order to apply the DBFE method, let us observe
that the decision boundary has the form y = mx + k, hence the normal vector is N = [-m, 1]^T. Evaluating Eq. (4.3) is straightforward since the normal vector is constant along S, and the equation becomes:

EDBFM = \frac{\int_S N N^T p(x)\,dx}{\int_S p(x)\,dx} = N N^T = \begin{bmatrix} m^2 & -m \\ -m & 1 \end{bmatrix}.    (4.4)
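As a quick numerical check of Eq. (4.4), the matrix N N^T and its spectrum can be computed for a sample slope (a minimal sketch; the value m = 2 is an arbitrary choice, not from the text):

```python
import numpy as np

# Normal to the line boundary y = m*x + k, for an arbitrary sample slope m.
m = 2.0
N = np.array([-m, 1.0])

# N is constant along S, so the integral defining the EDBFM collapses to N N^T.
EDBFM = np.outer(N, N)                     # [[m**2, -m], [-m, 1]]

# One eigenvalue is 0 (direction along the boundary, non-informative),
# the other is m**2 + 1 (direction normal to the boundary, informative).
eigvals, eigvecs = np.linalg.eigh(EDBFM)   # eigh returns eigenvalues in ascending order
```

The informative eigenvector (the column of `eigvecs` paired with the nonzero eigenvalue) is parallel to N, as expected for a rank-one matrix.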
Eigenvalues and related eigenvectors are \lambda_1 = 0, \lambda_2 = m^2 + 1, v_1 = [1, m]^T, v_2 = [-m, 1]^T, and only the second eigenvector v_2 is the informative direction. In this case the eigenvector components define the relevance of the real features. For instance, when m = 0 (boundary parallel to the x-axis) the only informative real feature is the y-axis; when m = 1 (boundary y = x) both features are equally important; finally, as m \to \infty (boundary tends to the y-axis) the relevance of the x-axis grows. As a second case, let us consider the border in Fig. 4.1b. In this case, cpdfs are taken constant along the boundary and the EDBFM is

EDBFM = \begin{bmatrix} 8 & 0 \\ 0 & 2 \end{bmatrix},

with \lambda_1 = 8, \lambda_2 = 2, v_1 = [1, 0]^T, v_2 = [0, 1]^T. This case is somewhat complementary to the former: now, since the new features coincide with the real ones, the relevance of the latter is fully expressed by the eigenvalues. From the analysis of these two cases we can derive that in the DBFE approach the eigenvector components represent the weight of every real feature locally to the new feature, whereas the eigenvalues represent the discriminative power of each new feature. Hence we can combine these two characteristics in order to define a global ranking of the real features, as is the objective of the present work. First, eigenvectors are weighted by multiplying them by the respective eigenvalues; then the corresponding components of the weighted eigenvectors are summed in absolute value. The resulting values are the individual contributions (or weights) of the real features to the transformation, and they represent the discriminative power of each real feature and its relative position in a rank model.

Formally, let u_1, u_2, \ldots, u_N be the eigenvectors of the EDBFM matrix, \lambda_1, \lambda_2, \ldots, \lambda_N the corresponding eigenvalues, and u_{ij} the j-th component of the eigenvector u_i. The weights of the real features are computed as follows:

w_j = \sum_{i=1}^{N} \lambda_i |u_{ij}|, \quad j = 1, \ldots, N,    (4.5)

and w_j > w_k implies that feature f_j is more important than feature f_k.
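The ranking rule of Eq. (4.5) can be sketched in a few lines of NumPy (the function name `dbfe_feature_weights` is ours, introduced only for illustration):

```python
import numpy as np

def dbfe_feature_weights(edbfm):
    """Weights of the real features per Eq. (4.5): w_j = sum_i lambda_i * |u_ij|."""
    eigvals, eigvecs = np.linalg.eigh(edbfm)   # column i of eigvecs is the eigenvector u_i
    # u_ij is the j-th component of u_i, i.e. eigvecs[j, i], so the sum over i
    # is a matrix-vector product of |eigvecs| with the eigenvalue vector.
    return np.abs(eigvecs) @ eigvals

# Second example of the text: EDBFM = diag(8, 2) gives w = [8, 2],
# so the first (x) feature ranks above the second (y) feature.
w = dbfe_feature_weights(np.diag([8.0, 2.0]))
```

Applied to the line-boundary EDBFM of Eq. (4.4) the same function weights the x feature by a factor m relative to the y feature, matching the slope-based discussion above.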