4 Artificial Neural Networks
Artificial Neural Networks (ANNs) are mathematical representations inspired by the functioning of the human brain (Bishop 1995; Pao 1989; Jianjun et al. 2006). In many practical cases a linear approximation is not valid, and the accuracy of system modeling decreases significantly; ANNs, by contrast, are capable of modeling very complex nonlinear functions. In addition, they keep in check the curse-of-dimensionality problem that bedevils efforts to model nonlinear functions with large numbers of variables.
4.1 Multi-layer Perceptron Networks
The multilayer neural network is typically composed of an input layer, one or more hidden layers, and an output layer, each consisting of several neurons. Each neuron processes its input and generates one output value, which is transmitted to the neurons in the subsequent layer. All neurons and layers are arranged in a feed-forward manner, and no feedback connections are allowed. The computation performed in each layer and neuron during training is given by the following equation (Kathryn et al. 2006):
$$y_p(k) = \mathrm{sgm}_p\!\left(\sum_{i=1}^{N_{k-1}} W_{ip}\, y_i(k-1) + b_p^k\right) \quad (p = 1, 2, \ldots, N_k;\ k = 1, 2, \ldots, M) \qquad (8)$$
where $W_{ip}$ is the connection weight between the $i$th neuron in the $(k-1)$th layer and the $p$th neuron in the $k$th layer, $y_p(k)$ is the output of the $p$th neuron in the $k$th layer, $\mathrm{sgm}_p$ is the sigmoid activation function of the $p$th neuron in the $k$th layer, and $b_p^k$ is the threshold of the $p$th neuron in the $k$th layer. The sigmoid activation function is given as:
$$\mathrm{sgm}(x) = \frac{1}{1 + \exp(-x)} \qquad (9)$$
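As a sketch, Eqs. (8) and (9) can be evaluated for a single layer as follows; the weights, thresholds, and input vector below are hypothetical values chosen purely for illustration:

```python
import math

def sgm(x):
    # Sigmoid activation function, Eq. (9): sgm(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))

def layer_output(W, y_prev, b):
    # Eq. (8): y_p(k) = sgm( sum_i W[i][p] * y_i(k-1) + b_p )
    # W[i][p] is the weight from neuron i in layer k-1 to neuron p in layer k.
    return [sgm(sum(W[i][p] * y_prev[i] for i in range(len(y_prev))) + b[p])
            for p in range(len(b))]

# Hypothetical 2-input, 2-neuron layer
W = [[0.5, -0.3],
     [0.2,  0.8]]      # connection weights W[i][p]
b = [0.1, -0.1]        # thresholds b_p
y_prev = [1.0, 0.5]    # outputs of the previous layer, y(k-1)

out = layer_output(W, y_prev, b)
# The second neuron's net input is -0.3 + 0.4 - 0.1 = 0.0, so its output is exactly 0.5.
print(out)
```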
The training process of the back-propagation algorithm runs according to the following steps (Oh 2010; Wang et al. 2010):
1. Initialize all weights at random.
2. Calculate the output vector.
3. Calculate the error propagation terms.
4. Update the weights by using Eq. (10).
5. Calculate the total error by using Eq. (11).
6. Iterate the calculation by returning to Step 2 until the total error is less than the tolerance "e".
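The six steps above can be sketched for a single sigmoid neuron. Eqs. (10) and (11) are not reproduced in this excerpt, so the delta-rule weight update and the summed squared error used below are the standard assumed forms; the AND data set, learning rate, and tolerance are illustrative choices, not values from the source:

```python
import math
import random

def sgm(x):
    # Sigmoid activation function, Eq. (9)
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical training data: logical AND with a single sigmoid neuron
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
eta, e = 0.5, 0.01                # assumed learning rate and error tolerance

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]   # Step 1: random weights
b = random.uniform(-0.5, 0.5)

for epoch in range(10000):                           # Step 6: iterate
    total_error = 0.0
    for x, t in data:
        o = sgm(sum(wi * xi for wi, xi in zip(w, x)) + b)    # Step 2: output
        delta = (t - o) * o * (1 - o)                        # Step 3: error term
        w = [wi + eta * delta * xi for wi, xi in zip(w, x)]  # Step 4: update (assumed Eq. 10)
        b += eta * delta
        total_error += 0.5 * (t - o) ** 2                    # Step 5: total error (assumed Eq. 11)
    if total_error < e:           # stop once the total error drops below "e"
        break
```

The same loop structure carries over to multilayer networks, where Step 3 propagates the error terms backward through the hidden layers.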