7.2.1 Classification
As mentioned above, the classification step in a BCI aims at translating the features into commands (McFarland et al. 2006; Mason and Birch 2003). To do so, one can use either regression algorithms (McFarland and Wolpaw 2005; Duda et al. 2001) or classification algorithms (Penny et al. 2000; Lotte et al. 2007), the latter being by far the most used in the BCI community (Bashashati et al. 2007; Lotte et al. 2007). As such, in this chapter, we focus only on classification algorithms. Classifiers are able to learn how to identify the class of a feature vector, thanks to training sets, i.e., labeled feature vectors extracted from the training EEG examples.
Typically, in order to learn which kind of feature vector corresponds to which class (or mental state), classifiers try either to model which area of the feature space is covered by the training feature vectors from each class (in this case, the classifier is a generative classifier), or to model the boundary between the areas covered by the training feature vectors of each class (in which case the classifier is a discriminant classifier). For BCI, the most used classifiers so far are discriminant classifiers, and notably linear discriminant analysis (LDA) classifiers.
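To make the generative versus discriminant distinction above concrete, here is a minimal sketch using scikit-learn; the data are synthetic and the model choices (a Gaussian naive Bayes model as the generative classifier, logistic regression as the discriminant one) are purely illustrative assumptions, not prescriptions from this chapter.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic two-dimensional "feature vectors" for two classes (illustrative only)
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Generative classifier: models the area of feature space covered by each class
# (here, one Gaussian per class and per feature)
generative = GaussianNB().fit(X, y)

# Discriminant classifier: models only the boundary between the two classes
discriminant = LogisticRegression().fit(X, y)

x_new = np.array([[0.3, -0.2]])
print(generative.predict(x_new), discriminant.predict(x_new))
```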
The aim of LDA (also known as Fisher's LDA) is to use hyperplanes to separate the training feature vectors representing the different classes (Duda et al. 2001; Fukunaga 1990). The location and orientation of this hyperplane are determined from the training data. Then, for a two-class problem, the class of an unseen (a.k.a. test) feature vector depends on which side of the hyperplane the feature vector lies (see Fig. 7.2). LDA has very low computational requirements, which makes it suitable for online BCI systems. Moreover, this classifier is simple, which makes it naturally good at generalizing to unseen data, hence generally providing good results in practice (Lotte et al. 2007). LDA is probably the most used classifier for BCI design.
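As an illustration, here is a minimal sketch of LDA classification with scikit-learn. The feature vectors below are synthetic placeholders standing in for EEG-derived features, since the chapter does not specify any particular implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
# X_train: (n_trials, n_features) labeled feature vectors from the training EEG examples
# y_train: 0 = first mental state, 1 = second mental state (labels are illustrative)
X_train = np.vstack([rng.normal(-1.0, 1.0, (40, 4)), rng.normal(1.0, 1.0, (40, 4))])
y_train = np.array([0] * 40 + [1] * 40)

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)  # hyperplane location and orientation are learned here

x_test = rng.normal(1.0, 1.0, (1, 4))    # an unseen (test) feature vector
side = lda.decision_function(x_test)      # signed distance to the hyperplane
print(lda.predict(x_test), side)          # the sign of `side` decides the class
print(lda.coef_, lda.intercept_)          # hyperplane normal vector and bias
```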
Another very popular classifier for BCI is the support vector machine (SVM) (Bennett and Campbell 2000). An SVM also uses a discriminant hyperplane to identify classes (Burges 1998). However, with SVM, the selected hyperplane is the one that maximizes the margin, i.e., the distance between the hyperplane and the nearest training feature vectors.
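A similarly minimal sketch of a linear SVM, again with scikit-learn and synthetic data as assumptions:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(-1.0, 1.0, (40, 4)), rng.normal(1.0, 1.0, (40, 4))])
y_train = np.array([0] * 40 + [1] * 40)

# Linear SVM: the learned hyperplane maximizes the margin to the nearest
# training feature vectors (the support vectors); C trades margin width
# against training errors.
svm = SVC(kernel="linear", C=1.0).fit(X_train, y_train)

x_test = rng.normal(-1.0, 1.0, (1, 4))
print(svm.predict(x_test))
print(svm.support_vectors_.shape)  # the training points that define the margin
```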
Fig. 7.2 Discriminating two types of motor imagery with a linear hyperplane using a linear discriminant analysis (LDA) classifier