$$
\tilde{I}_{ij} = \left\langle \tilde{\Lambda}_{s_i}, \tilde{\Lambda}_{s_j} \right\rangle
= \left\langle \Lambda_{s_i} - \bar{\Lambda},\, \Lambda_{s_j} - \bar{\Lambda} \right\rangle
= \left\langle \Lambda_{s_i}, \Lambda_{s_j} \right\rangle
- \frac{1}{N} \sum_{l=1}^{N} \left\langle \Lambda_{s_i}, \Lambda_{s_l} \right\rangle
- \frac{1}{N} \sum_{l=1}^{N} \left\langle \Lambda_{s_l}, \Lambda_{s_j} \right\rangle
+ \frac{1}{N^2} \sum_{l=1}^{N} \sum_{n=1}^{N} \left\langle \Lambda_{s_l}, \Lambda_{s_n} \right\rangle. \tag{1.34}
$$
In matrix notation,

$$
\tilde{I} = I - \frac{1}{N}\left( \mathbf{1}_N I + I \mathbf{1}_N \right) + \frac{1}{N^2}\, \mathbf{1}_N I\, \mathbf{1}_N, \tag{1.35}
$$

where $I$ is the Gram matrix of the inner products of spike trains, with entries $I_{ij} = \langle \Lambda_{s_i}, \Lambda_{s_j} \rangle$, and $\mathbf{1}_N$ is the $N \times N$ matrix with all ones. This means that $\tilde{I}$ can be computed directly in terms of $I$ without the need to explicitly remove the mean of the transformed spike trains.
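Equation (1.35) is a one-line matrix computation. The following is a minimal NumPy sketch, assuming the Gram matrix `I` has already been filled in with the spike-train inner products $\langle \Lambda_{s_i}, \Lambda_{s_j} \rangle$; the function name `center_gram` is ours, for illustration only.

```python
import numpy as np

def center_gram(I: np.ndarray) -> np.ndarray:
    """Centered Gram matrix of Equation (1.35).

    I: (N, N) Gram matrix with entries <Lambda_{s_i}, Lambda_{s_j}>.
    """
    N = I.shape[0]
    ones = np.ones((N, N))  # 1_N: the N x N matrix of all ones
    # I_tilde = I - (1/N)(1_N I + I 1_N) + (1/N^2) 1_N I 1_N
    return I - (ones @ I + I @ ones) / N + (ones @ I @ ones) / N**2
```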
From Equation (1.33), finding the principal components simplifies to the problem of estimating the coefficients $\{b_i\}$ that maximize $J(\xi)$. Since $J(\xi)$ is a quadratic function, its extrema can be found by equating the gradient to zero. Taking the derivative with regard to $\mathbf{b}$ (which characterizes $\xi$) and setting it to zero results in

$$
\frac{\partial J(\xi)}{\partial \mathbf{b}} = 2 \tilde{I}^2 \mathbf{b} - 2 \rho \tilde{I} \mathbf{b} = 0 \tag{1.36}
$$
and thus corresponds to the eigendecomposition problem³

$$
\tilde{I} \mathbf{b} = \rho \mathbf{b}. \tag{1.37}
$$
This means that any eigenvector of the centered Gram matrix is a solution of Equation (1.36). Thus, the eigenvectors determine the coefficients of Equation (1.32) and characterize the principal components. It is easy to verify that, as expected, the variance of the projections onto each principal component equals the corresponding eigenvalue. So, the ordering of $\rho$ specifies the relevance of the principal components.
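The eigendecomposition of Equation (1.37) can be computed with any standard symmetric eigensolver; sorting the eigenvalues in decreasing order gives the relevance ordering just described. A sketch, building on the hypothetical `center_gram` above:

```python
def spike_train_pca(I: np.ndarray):
    """Solve the eigendecomposition problem of Equation (1.37).

    Returns the eigenvalues rho in decreasing order and the matrix B
    whose k-th column holds the coefficients b_k of Equation (1.32).
    As noted in the text, rho[k] equals the variance of the
    projections onto the k-th principal component.
    """
    I_tilde = center_gram(I)
    rho, B = np.linalg.eigh(I_tilde)  # I_tilde is symmetric
    order = np.argsort(rho)[::-1]     # largest eigenvalue first
    return rho[order], B[:, order]
```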
To compute the projection of a given input spike train $s$ onto the $k$th principal component (corresponding to the eigenvector with the $k$th largest eigenvalue), we need only to compute in the RKHS the inner product of $\Lambda_s$ with $\xi_k$. That is,

$$
\operatorname{Proj}_{\xi_k}(\Lambda_s) = \left\langle \Lambda_s, \xi_k \right\rangle = \sum_{j=1}^{N} b_j^k \left\langle \Lambda_s, \tilde{\Lambda}_{s_j} \right\rangle.
$$
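In code, each term expands as $\langle \Lambda_s, \tilde{\Lambda}_{s_j} \rangle = \langle \Lambda_s, \Lambda_{s_j} \rangle - \frac{1}{N}\sum_l \langle \Lambda_s, \Lambda_{s_l} \rangle$, so centering the kernel evaluations against the training spike trains suffices. A sketch, again with hypothetical names and leaving aside any unit-norm rescaling of $\xi_k$:

```python
def project(kappa: np.ndarray, B: np.ndarray, k: int) -> float:
    """Projection of a spike train s onto the k-th principal component.

    kappa: (N,) vector of inner products <Lambda_s, Lambda_{s_j}>
           between s and the N spike trains used to build the Gram matrix.
    B:     coefficient matrix returned by spike_train_pca.
    """
    # <Lambda_s, Lambda~_{s_j}> = kappa_j - mean(kappa)
    return float(B[:, k] @ (kappa - kappa.mean()))
```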
³ Note that the simplification in the eigendecomposition problem is valid regardless of whether the Gram matrix is invertible or not, since $\tilde{I}^2$ and $\tilde{I}$ have the same eigenvectors and the eigenvalues of $\tilde{I}^2$ are the eigenvalues of $\tilde{I}$ squared.