$$C_{sum} = C_{avg1} + C_{avg2}. \tag{2}$$

Then $C_{sum}$ can be factored by eigenvalue decomposition as $C_{sum} = V_{sum} D_{sum} V_{sum}^{T}$, where $V_{sum}$ is the matrix of eigenvectors and $D_{sum}$ is the diagonal matrix of eigenvalues. Denote the whitening matrix $P$ as

$$P = \sqrt{D_{sum}^{-1}}\; V_{sum}^{T}. \tag{3}$$
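As a concrete illustration, a minimal numpy sketch of this whitening step could look as follows (a sketch under the assumption that `C_avg1` and `C_avg2` hold the class-average covariance matrices of Eq. 2; the function and variable names are illustrative, not from the original):

```python
import numpy as np

def whitening_matrix(C_avg1, C_avg2):
    """Whitening matrix P of Eq. 3 from the two class-average covariances."""
    C_sum = C_avg1 + C_avg2              # Eq. 2
    D, V = np.linalg.eigh(C_sum)         # eigenvalues D (ascending), eigenvectors V (columns)
    return np.diag(D ** -0.5) @ V.T      # P = sqrt(D^-1) V^T, Eq. 3
```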
Applying $P$ to Eq. 2 makes the left side of the equation equal to the identity matrix $I$:

$$P C_{sum} P^{T} = P C_{avg1} P^{T} + P C_{avg2} P^{T}. \tag{4}$$
$P C_{avg1} P^{T}$ and $P C_{avg2} P^{T}$ can be factored by eigenvalue decomposition as $Q E_{1} Q^{T}$ and $Q E_{2} Q^{T}$, respectively; since the two whitened covariances sum to the identity, they share the same eigenvector matrix $Q$. We arrange these matrices so that the diagonal elements of $E_{1}$ and $E_{2}$ are sorted in descending and ascending order, respectively.
$$I = Q E_{1} Q^{T} + Q E_{2} Q^{T}. \tag{5}$$
Finally, we obtain the $N \times N$ CSP projection matrix $W$ as

$$W = Q^{T} P. \tag{6}$$
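Continuing the sketch above, the simultaneous diagonalization and the final projection matrix of Eq. 6 could be implemented as follows (again illustrative only; `whitening_matrix` is the hypothetical helper defined earlier):

```python
def csp_projection(C_avg1, C_avg2):
    """CSP projection matrix W of Eq. 6 (sketch)."""
    P = whitening_matrix(C_avg1, C_avg2)
    S1 = P @ C_avg1 @ P.T                # whitened class-1 covariance
    E1, Q = np.linalg.eigh(S1)           # S1 = Q E1 Q^T; S2 = Q (I - E1) Q^T shares Q
    order = np.argsort(E1)[::-1]         # sort E1 descending, so E2 is ascending
    Q = Q[:, order]
    return Q.T @ P                       # W = Q^T P, Eq. 6
```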
We call each row of $W$ a spatial filter, and each column of $W^{-1}$ a spatial pattern. Typically, only some spatial filters are selected. For a certain value $m$, we keep the first $m$ and the last $m$ rows of $W$ ($2m$ rows altogether) and remove the remaining middle $N - 2m$ rows. The remaining rows form the matrix $W_r$ with dimension $2m \times N$. We can apply this matrix to any sample matrix $X_{raw}$ (similar to $X$ but without centering):
$$Z = W_{r} X_{raw}. \tag{7}$$
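A hedged sketch of this row selection and projection, assuming `X_raw` is an $N \times T$ channels-by-samples matrix (names again illustrative):

```python
def csp_project(W, X_raw, m):
    """Reduced projection of Eq. 7: keep the first m and last m spatial filters."""
    W_r = np.vstack([W[:m], W[-m:]])     # 2m x N matrix W_r
    return W_r @ X_raw                   # Z = W_r X_raw, Eq. 7
```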
A feature vector $f = [f_1, \ldots, f_{2m}]^{T}$ can be defined based on the log of the variance ratio:

$$f_{i} = \log\!\left( \frac{\operatorname{var}(Z_{i})}{\sum_{j=1}^{2m} \operatorname{var}(Z_{j})} \right), \tag{8}$$
where $\operatorname{var}(Z_{j})$ is the variance of the samples in row $j$. This feature vector is ready to serve as input for any classifier. In this paper, we use simple Linear Discriminant Analysis (LDA) as the classifier in the evaluation process; however, other classifiers can be used.
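To round out the sketch, the feature extraction of Eq. 8 and an LDA classifier could be wired together as below; scikit-learn's LinearDiscriminantAnalysis is one possible choice (an assumption on our part, not stated in the original), and the training-data names are hypothetical:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def log_var_features(Z):
    """Feature vector f of Eq. 8 from the projected signals Z (2m x T)."""
    v = np.var(Z, axis=1)                # var(Z_j) for each of the 2m rows
    return np.log(v / v.sum())           # f_i = log(var(Z_i) / sum_j var(Z_j))

# Hypothetical usage with training trials X_train (list of N x T matrices) and labels y_train:
# F = np.array([log_var_features(csp_project(W, X, m)) for X in X_train])
# clf = LinearDiscriminantAnalysis().fit(F, y_train)
```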
2.2 Our Analysis of CSP
Let us examine the derivation of the CSP projection matrix. It is based on the average of the covariance matrices of trials of the same class. If there is only one trial for each class,