Fig. 1.50. Output of a neuron with 3 inputs $\{x_0 = 1, x_1, x_2\}$ with weights $\{w_0 = 0, w_1 = +1, w_2 = -1\}$, whose activation function is a tanh function: $y = \tanh(x_1 - x_2)$
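The neuron of Fig. 1.50 can be sketched directly in a few lines; this is an illustrative implementation using only the weights stated in the caption:

```python
import math

def neuron_output(x1, x2):
    """Neuron with inputs {x0 = 1, x1, x2}, weights
    {w0 = 0, w1 = +1, w2 = -1}, and tanh activation (Fig. 1.50)."""
    w0, w1, w2 = 0.0, 1.0, -1.0
    v = w0 * 1.0 + w1 * x1 + w2 * x2  # potential: weighted sum of inputs
    return math.tanh(v)               # nonlinear activation
```

Since $w_0 = 0$, the bias input $x_0 = 1$ contributes nothing here, and the output reduces to $\tanh(x_1 - x_2)$.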
Two variants of that type of neuron are
•
high-order neural networks, whose potential is not an affine function of the
inputs, but a polynomial function; they are the ancestors of the support
vector machines (or SVM) used essentially for classification, described in
Chap. 6;
•
McCulloch-Pitts neurons, or perceptrons, which are the ancestors of
present-day neurons; Chap. 6 describes in detail their use for discrimi-
nation.
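The first variant can be illustrated with a minimal sketch: a second-order neuron whose potential is a degree-2 polynomial of two inputs. The weight ordering used here is a hypothetical choice for illustration, not notation from the text:

```python
import math

def high_order_neuron(x1, x2, w):
    """Second-order neuron: the potential is a polynomial (not affine)
    function of the inputs. Hypothetical weight layout:
    w = [bias, x1, x2, x1*x2, x1^2, x2^2]."""
    v = (w[0] + w[1] * x1 + w[2] * x2        # affine part
         + w[3] * x1 * x2                    # cross term
         + w[4] * x1 ** 2 + w[5] * x2 ** 2)  # quadratic terms
    return math.tanh(v)
```

With the higher-order weights set to zero, e.g. `w = [0, 1, -1, 0, 0, 0]`, the potential degenerates to the affine case and the output matches the neuron of Fig. 1.50.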
1.6.1.2 Neurons with Parameterized Nonlinearities
The parameters of those neurons are assigned to their nonlinearity: they are
present in function $f$. Thus, the latter may be a “radial basis function” (RBF)
or a wavelet.
Example
Gaussian radial basis function,
$$y = \exp\left(-\,\frac{\sum_{i=1}^{n}(x_i - w_i)^2}{2\,w_{n+1}^2}\right).$$
The parameters $\{w_i,\ i = 1 \text{ to } n\}$ are the coordinates of the center of the
Gaussian in input space; parameter $w_{n+1}$ is its standard deviation. Fig-
ure 1.51 shows an isotropic Gaussian RBF, centered at the origin, with stan-
dard deviation $1/\sqrt{2}$.
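The Gaussian RBF above can be evaluated with a short sketch; the parameter vector packs the center coordinates followed by the standard deviation, matching the formula:

```python
import math

def gaussian_rbf(x, w):
    """Gaussian RBF neuron with n = len(x) inputs.
    w[0..n-1]: center of the Gaussian in input space.
    w[n]:      standard deviation w_{n+1} of the text."""
    n = len(x)
    sq_dist = sum((x[i] - w[i]) ** 2 for i in range(n))
    return math.exp(-sq_dist / (2.0 * w[n] ** 2))
```

For the isotropic Gaussian of Figure 1.51, centered at the origin with standard deviation $1/\sqrt{2}$, the output is 1 at the center and $e^{-1}$ at unit distance from it.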