reproducing kernel Hilbert spaces (RKHS) of functions on spike trains, which pro-
vide the needed mathematical structure to easily define and optimize criteria for a
diverse range of problems. Another advantage of this approach is that many of the
difficulties found in manipulating spike trains which lead to the use of binning are
implicitly taken care of through the mapping to the RKHS. In this chapter we ex-
emplify the construction of an RKHS by defining an inner product of spike trains
called the memoryless cross-intensity (mCI) kernel. This spike train kernel defines the
RKHS bottom-up as an inner product of intensity functions and thus incorporates
a statistical description of the spike trains. As will be shown later, this particular
kernel is related to the generalized cross-correlation (GCC) [18] but provides a more
principled and broader perspective on many spike train methods reported in the
literature.
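As a rough sketch of how such an inner product can be computed in practice (the function names and the choice of an exponential smoothing function with time constant `tau` are illustrative assumptions, not the chapter's own derivation): if each intensity function is estimated by smoothing the spike train with an exponential function, the integral of the product of the two intensity estimates reduces, up to a constant factor, to a double sum of Laplacian kernel evaluations over all pairs of spike times.

```python
import numpy as np

def mci_kernel(spikes_i, spikes_j, tau=0.01):
    """Sketch of a memoryless cross-intensity (mCI) kernel estimate
    between two spike trains, given as arrays of spike times (seconds).

    Assumes each intensity function is estimated by exponential
    smoothing with time constant tau; the integrated product of the
    two estimates then reduces (up to a constant factor) to a sum of
    Laplacian kernel evaluations over all spike-time pairs.
    """
    ti = np.asarray(spikes_i, dtype=float)[:, None]  # column of spike times
    tj = np.asarray(spikes_j, dtype=float)[None, :]  # row of spike times
    return np.exp(-np.abs(ti - tj) / tau).sum()
```

Note that the estimator is symmetric in its two arguments, as an inner product must be, and that nearby spike pairs contribute more than distant ones.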
For continuous and discrete random processes, RKHS theory has already been
proven essential in a number of applications, such as statistical signal processing
[20, 23] and detection [9, 11, 10], as well as statistical learning theory [29, 35, 38].
Indeed, Parzen showed that several statistical signal processing algorithms can be
stated as optimization problems in the RKHS and easily solved [20, 23]. For in-
stance, the cross-correlation function used throughout statistical analysis and signal
processing, including the celebrated Wiener filter [8], is a valid kernel and induces
an RKHS [20]. Although frequently overlooked, RKHS theory plays a pivotal
role in kernel methods [29, 35] because it is the reason for the famed kernel trick
which allows for the otherwise seemingly intractable task of deriving and applying
kernel techniques.
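The kernel trick mentioned above can be illustrated with a standard toy example outside the spike-train setting (the feature map below is the textbook one for a degree-2 homogeneous polynomial kernel, not anything specific to this chapter): evaluating the kernel directly yields the same inner product as explicitly mapping the points into the induced feature space, but without ever constructing that space.

```python
import numpy as np

def phi(x):
    # Explicit feature map for the 2-D homogeneous polynomial kernel
    # of degree 2: x -> (x1^2, sqrt(2)*x1*x2, x2^2).
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

def poly_kernel(x, y):
    # Kernel evaluation: (x . y)^2, with no explicit feature map.
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
# Both routes give the same inner product in the induced RKHS.
assert np.isclose(np.dot(phi(x), phi(y)), poly_kernel(x, y))
```

Algorithms written purely in terms of inner products can therefore be "kernelized" by substituting kernel evaluations, which is precisely what makes RKHS theory the engine behind kernel methods.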
In the following, we introduce how to define spike train kernels and present some
examples. A systematic approach which builds the RKHS from the ground up is fol-
lowed, by defining inner products directly for spike trains. The main advantage of this path is
a general and mathematically precise methodology which, nevertheless, can easily
be interpreted intuitively by analyzing the definition of the inner product or, con-
versely, defining the inner product to match our understanding of a given problem.
In this study we present the mCI kernel as an example, since it incorporates a statis-
tical description of the spike trains and the statistical model is clearly stated, but the
ideas can be easily extended. A number of properties are proved for the mCI kernel,
and the relationships between the RKHS and the congruent spaces are discussed
for additional insight. The issue of estimation from data is also addressed. Finally,
the usefulness of an RKHS framework for optimization is demonstrated through the
derivation of an algorithm for principal component analysis (PCA) of spike trains.
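Since such an algorithm accesses the spike trains only through their inner products, its overall structure can be sketched as PCA on a Gram matrix of kernel evaluations (the function below is a generic kernel-PCA sketch under that assumption, not the chapter's own derivation): center the Gram matrix in the RKHS, eigendecompose it, and project onto the leading eigenvectors.

```python
import numpy as np

def kernel_pca(K, n_components=2):
    """Sketch of PCA in an RKHS from a Gram matrix K, where K[i, j]
    holds the inner product (e.g. an mCI kernel evaluation) between
    spike trains i and j. Centers the data in feature space, then
    eigendecomposes the centered Gram matrix."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # center in the RKHS
    eigvals, eigvecs = np.linalg.eigh(Kc)        # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Project each spike train onto the leading principal axes,
    # normalizing the expansion coefficients by sqrt(eigenvalue).
    return Kc @ eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))

# Usage sketch on a synthetic positive-semidefinite Gram matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))
scores = kernel_pca(X @ X.T)
```

The projections onto distinct principal components come out mutually orthogonal, mirroring ordinary PCA; only the Gram matrix, never the underlying objects, is needed.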
1.2 Some Background on RKHS Theory
In this section, some basic concepts of kernel methods and RKHS theory nec-
essary for understanding the next sections are reviewed. The notation was
purposely chosen to be different from the one used later since the presentation here
is meant to be as general and introductory as possible.