Table 3.5 Synoptic comparison of perceptrons according to the activation function (entries compare the min P_e, min EE, and max EE risk functionals for the Heaviside and sigmoidal activations).
a. w.s.c. stands for “well separated classes”, usually with distance between the
means exceeding 1.5 times the standard deviation.
b. p.a. stands for “presumably always”.
3.4 The Hypersphere Neuron
The hypersphere neuron implements the following classifier function family:
z_w(x) = θ( φ( ‖x − w‖² − w₀ ) );  w ∈ ℝ^d, w₀ ∈ ℝ .   (3.70)
The argument of the activation function φ(·) defines a hyperparaboloid in d-dimensional space. Setting φ(‖x − w‖² − w₀) = a, a hyperspherical decision border is obtained.
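As a concrete illustration (a minimal Python sketch of our own; the function and argument names are not from the text), the classifier family (3.70) with the identity φ and the Heaviside θ amounts to testing on which side of a hypersphere a point lies:

```python
import numpy as np

def hypersphere_output(x, w, w0):
    """Hypersphere neuron theta(phi(||x - w||^2 - w0)) with identity phi
    and Heaviside theta. The decision border ||x - w||^2 = w0 is a
    hypersphere of radius sqrt(w0) centred at w (a circle when d = 2);
    points outside it map to 1, points inside to 0."""
    s = np.sum((np.asarray(x) - np.asarray(w)) ** 2, axis=-1) - w0
    return (s >= 0).astype(int)
```

For example, with centre w = [0, 0] and w₀ = 1, the point [2, 0] lies outside the unit circle and the point [0.5, 0] inside it, so the neuron outputs 1 and 0 respectively.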
The hypersphere classifier was first used as a building block in a more sophisticated type of classifier, the so-called compound classifier [18, 19]. In more recent times, it has attracted the attention of many researchers in the pattern recognition area due to the ease of use and efficiency revealed by networks of hyperspheres [119, 16, 17, 236]. It has also inspired some new approaches to RBF NNs [58].
The training algorithm for the hypersphere neuron, using gradient descent,
follows exactly the same steps mentioned in Sect. 3.1.2. In the following
experiments we only consider the hypersphere classifier in bivariate space;
the decision border is then a circle.
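Such a gradient-descent run can be sketched as follows (our own minimal Python implementation, not the book's code: it uses a sigmoid activation and squared-error risk, and the initialisation, learning rate, and epoch count are arbitrary assumptions):

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def train_hypersphere(X, t, lr=0.1, epochs=500, rng=None):
    """Gradient-descent sketch for a single hypersphere neuron.

    Output z = sigmoid(w0 - ||x - w||^2): close to 1 inside the sphere,
    close to 0 outside. Minimises the mean squared error; the book's
    min-EE / min-Pe variants would swap in a different risk functional.
    Returns the centre w and squared radius w0."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    w = X.mean(axis=0) + 0.01 * rng.standard_normal(d)  # centre
    w0 = 1.0                                            # squared radius
    for _ in range(epochs):
        s = w0 - np.sum((X - w) ** 2, axis=1)
        z = sigmoid(s)
        g = 2.0 * (z - t) * z * (1.0 - z)     # dLoss/ds per sample
        # ds/dw = 2(x - w), ds/dw0 = 1
        w -= lr * np.mean(g[:, None] * 2.0 * (X - w), axis=0)
        w0 -= lr * np.mean(g)
    return w, w0
```

In the bivariate case the learned border ‖x − w‖² = w₀ is a circle of radius √w₀ centred at w.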
3.4.1 Motivational Examples
Example 3.10. In this example we use two Gaussian distributed class-conditional PDFs, g(x; μ_t, Σ_t), with μ₁ = [2 0]^T, μ₂ = [0 0]^T, and Σ₁ = Σ₂ = I.
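The class-conditional setup of this example can be sampled as follows (a Python sketch; the sample size per class and the random seed are arbitrary choices of ours, not given in the text):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200                                   # samples per class (assumed)
mu1 = np.array([2.0, 0.0])                # mean of class omega_1
mu2 = np.array([0.0, 0.0])                # mean of class omega_2
X1 = rng.multivariate_normal(mu1, np.eye(2), n)   # draws from g(x; mu1, I)
X2 = rng.multivariate_normal(mu2, np.eye(2), n)   # draws from g(x; mu2, I)
X = np.vstack([X1, X2])
t = np.r_[np.ones(n), np.zeros(n)]        # target 1 for omega_1 (our encoding)
```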