Note that the values of units are arbitrary, but they should range from 0 to 1 (sometimes −1 to 1). In general, every unit has the following aspects:
A set of inputs connects to it; each connection is defined by a weight.
Its weighted sum is computed by summing all the inputs, each multiplied by its respective weight.
A bias value is added to the weighted sum. This weighted sum is also called the activation value.
Its output is the result of applying the activation function to the weighted sum. The activation function is a crucial factor in a neural network.
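The aspects above can be sketched in Python as a single unit's forward computation. This is a minimal illustration, assuming a sigmoid activation; the function name and sample values are hypothetical:

```python
import math

def unit_output(inputs, weights, bias):
    """One unit: weighted sum of inputs plus bias, squashed by a sigmoid.
    The sigmoid here is an illustrative choice of activation function."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# Hypothetical example: two inputs, two weights, one bias.
print(unit_output([0.5, 0.2], [0.4, -0.3], 0.1))  # a value in (0, 1)
```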
The activation function is a squashing function: it “squashes” a large weighted sum into a smaller value, typically in the range 0 to 1 (sometimes −1 to 1). There are three types of activation function:
The threshold function takes the value 0 if the weighted sum is less than 0, and the value 1 otherwise:

μ(x) = 0 if x < 0, and μ(x) = 1 if x ≥ 0
The piecewise-linear function takes on values according to the amplification factor in a certain region of linear operation:

μ(x) = 0 if x ≤ −1/2, μ(x) = x if −1/2 < x < 1/2, and μ(x) = 1 if x ≥ 1/2
The sigmoid function takes on values in the range [0, 1] (or [−1, 1]). The formula of the sigmoid function is:

μ(x) = 1 / (1 + e^(−x))
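The three activation functions above can be written directly in Python. This is a sketch following the definitions in the text; the function names are hypothetical:

```python
import math

def threshold(x):
    """Threshold (step) function: 0 for x < 0, 1 for x >= 0."""
    return 0.0 if x < 0 else 1.0

def piecewise_linear(x):
    """Piecewise-linear function with unit amplification factor,
    following the text's definition: saturates outside (-1/2, 1/2)."""
    if x <= -0.5:
        return 0.0
    if x >= 0.5:
        return 1.0
    return x  # region of linear operation

def sigmoid(x):
    """Sigmoid: squashes any real x into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```

Note that only the sigmoid is smooth everywhere, which is one reason it is commonly preferred when gradients are needed.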
Fig. 8 Sigmoid function
There are two topologies of neural network:
Feed-forward neural network. It is a directed acyclic graph in which the signal flows one way, from input units to output units, hence the name feed-forward. There are no feedback connections.
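A feed-forward pass can be sketched as successive layer computations, each feeding only the next layer. This is a minimal illustration with hypothetical names, weights, and a sigmoid activation, not a definitive implementation:

```python
import math

def layer_forward(inputs, weights, biases):
    """One feed-forward layer: each unit applies the sigmoid to its bias
    plus the weighted sum of all inputs. Signals flow one way only."""
    outputs = []
    for unit_weights, bias in zip(weights, biases):
        s = sum(x * w for x, w in zip(inputs, unit_weights)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-s)))
    return outputs

# Hypothetical two-layer network: input -> hidden (2 units) -> output (1 unit).
hidden = layer_forward([0.9, 0.1], [[0.5, -0.4], [0.3, 0.8]], [0.0, -0.2])
output = layer_forward(hidden, [[1.0, -1.0]], [0.1])
print(output)  # a single value in (0, 1)
```

Because there are no feedback connections, one pass over the layers in order suffices to compute the output.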