Algorithm 10 Laplacian-based n-dimensional embedding
1: Set $d_X$, $\sigma$ and $n$, the dimension of the embedding.
2: Compute $K$ with $K(i,j) = e^{-d_X(x_i,x_j)^2/\sigma^2}$.
3: Compute $D_K$ with $D_K(i,i) = \sum_{j=1}^{N} K(i,j)$ and $D_K(i,j) = 0$ if $i \neq j$.
4: Compute $W = D_K^{-1/2} K D_K^{-1/2}$ (Eq. 7.6).
5: Compute $D$ with $D(i,i) = \sum_{j=1}^{N} W(i,j)$ and $D(i,j) = 0$ if $i \neq j$.
6: Find the $n+1$ first generalized eigenvectors $f_k$, solutions of $(D - W) f_k = \lambda_k D f_k$, $k = 0, \ldots, n$.
7: The coordinates of point $x_i$ in $\mathbb{R}^n$ are $(f_1(i), f_2(i), \ldots, f_n(i))$.
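As an illustration only, these steps could be implemented with NumPy/SciPy as sketched below; the function name laplacian_embedding, the calling convention for d_X and the use of a dense generalized eigensolver are choices made for this sketch, not part of the original algorithm.

import numpy as np
from scipy.linalg import eigh

def laplacian_embedding(X, d_X, sigma, n_dims):
    """Sketch of Algorithm 10: Laplacian-based n-dimensional embedding."""
    N = len(X)

    # Step 2: Gaussian kernel K(i, j) = exp(-d_X(x_i, x_j)^2 / sigma^2).
    K = np.array([[np.exp(-d_X(X[i], X[j]) ** 2 / sigma ** 2)
                   for j in range(N)] for i in range(N)])

    # Steps 3-4: normalized kernel W = D_K^{-1/2} K D_K^{-1/2} (Eq. 7.6).
    d_K = K.sum(axis=1)
    W = K / np.sqrt(np.outer(d_K, d_K))

    # Step 5: degree matrix D of W.
    D = np.diag(W.sum(axis=1))

    # Step 6: generalized eigenproblem (D - W) f = lambda D f;
    # eigh returns eigenvalues in ascending order, so the first column
    # is the trivial eigenvector associated with lambda_0 = 0.
    eigvals, eigvecs = eigh(D - W, D)

    # Step 7: the coordinates of x_i are (f_1(i), ..., f_n(i)).
    return eigvecs[:, 1:n_dims + 1]

This dense sketch builds an N x N kernel and is only meant for small datasets; a sparse, iterative variant is sketched further below.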
The embedding function $f$ can therefore be obtained by computing the eigenvectors $f_q$ of $L_{rw}$, which satisfy:

$(I - D^{-1} W) f_q = \lambda_q f_q.$

Eigenvectors of $L_{rw}$ are generalized eigenvectors of $L$ since $L f_q = \lambda_q D f_q$.
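Indeed, multiplying the $L_{rw}$ eigenvalue equation on the left by $D$ gives

$L f_q = (D - W) f_q = D\,(I - D^{-1} W) f_q = \lambda_q D f_q.$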
It can easily be proved that 0 is a trivial eigenvalue of $L$, and also that $L$ is a symmetric positive semidefinite matrix, which implies, since $d(x_i) > 0$ for all $i$, that the generalized eigenvalues $\lambda_q$ are nonnegative and can be ordered: $0 = \lambda_0 \leq \lambda_1 \leq \cdots \leq \lambda_q \leq \lambda_{q+1} \leq \cdots \leq \lambda_{N-1}$. The embedding function $f$ into $\mathbb{R}^n$ is then:

$f(x_i) = (f_1(i), \ldots, f_n(i)),$    (7.7)
where $f_q(i)$ is the $i$th component of $f_q$.

Algorithm 10 presents the different steps of the approach. Note that, since $L$ is symmetric, its first eigenvectors can be computed efficiently with an iterative method. It is also possible to set to 0 the weights $w_{ij}$ below a threshold, leading to sparse matrices and reducing the computational cost of the matrix-vector multiplications at each iteration.
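One possible way to exploit this remark is sketched below with SciPy's iterative Lanczos solver on sparse matrices. Rather than solving the generalized problem directly, the sketch uses the equivalent standard eigenproblem on the symmetric matrix $D^{-1/2} W D^{-1/2}$; the threshold value and the function name are illustrative, not taken from the text.

import numpy as np
from scipy.sparse import csr_matrix, diags
from scipy.sparse.linalg import eigsh

def sparse_spectral_embedding(W, n_dims, threshold=1e-6):
    """Embedding from a thresholded, sparse weight matrix W.

    Relies on the equivalence (D - W) f = lambda D f  <=>  S g = (1 - lambda) g,
    with S = D^{-1/2} W D^{-1/2} and f = D^{-1/2} g, so that only the largest
    eigenvectors of the sparse symmetric matrix S are needed.
    """
    W = np.where(W < threshold, 0.0, W)        # zero out small weights
    W_s = csr_matrix(W)                        # sparse storage

    d = np.asarray(W_s.sum(axis=1)).ravel()    # degrees d(x_i) of the sparse graph
    d_inv_sqrt = diags(1.0 / np.sqrt(d))
    S = d_inv_sqrt @ W_s @ d_inv_sqrt          # sparse symmetric matrix

    # n + 1 largest eigenpairs of S via Lanczos iterations, which only
    # require sparse matrix-vector products at each step.
    vals, vecs = eigsh(S, k=n_dims + 1, which='LA')
    order = np.argsort(-vals)                  # decreasing S-eigenvalues = increasing lambda
    F = d_inv_sqrt @ vecs[:, order]            # back to generalized eigenvectors f_q
    return F[:, 1:n_dims + 1]                  # drop the trivial f_0 (lambda_0 = 0)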
Manifold learning methods are “data driven”. They capture the structure of the dataset, provided that the chosen distance $d_X$ is appropriate. When dealing with time series, many distance functions can be used. In practice, $d_X$ does not need to be a distance in the mathematical sense: instead it can measure the difference between features of interest for two elements of the dataset. One may even design $d_X$ to be blind to features of the data that are irrelevant for the application at hand.
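Purely as an illustration of this flexibility (the name spectral_dissimilarity and the choice of comparing normalized power spectra are assumptions of this sketch, not taken from the text), one could define, for equal-length time series:

import numpy as np

def spectral_dissimilarity(x_i, x_j):
    # Compare normalized power spectra: two signals that differ only by a
    # circular time shift (a feature assumed irrelevant here) are at
    # dissimilarity 0, so this is not a distance on the signals in the
    # strict mathematical sense.
    p_i = np.abs(np.fft.rfft(x_i)) ** 2
    p_j = np.abs(np.fft.rfft(x_j)) ** 2
    p_i /= p_i.sum()
    p_j /= p_j.sum()
    return np.sqrt(np.sum((p_i - p_j) ** 2))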
7.2.3 Application to the Reordering of EEG Time Series
The purpose of the previous section was to compute an embedding function, which
now allows the parameterization of the dataset according to the first embedding
 