Digital Signal Processing Reference
FIGURE 10.65. Three-layer neural network with seven nodes.
half of the 128 points resulting from a 128-point FFT of the signal to be recognized.
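The feature vector described above can be sketched as follows. This is a minimal illustration, assuming NumPy and a magnitude spectrum normalized to [0, 1]; the synthetic test signal and the normalization step are assumptions, not details from the text.

```python
import numpy as np

# Hypothetical sketch: form a 64-element feature vector from the
# first half of a 128-point FFT, as described in the text.
# The test signal (a tone at bin 5) is an illustrative assumption.

N = 128
n = np.arange(N)
signal = np.sin(2 * np.pi * 5 * n / N)   # synthetic tone at FFT bin 5

spectrum = np.fft.fft(signal, n=N)
features = np.abs(spectrum[:N // 2])     # keep first 64 points; a real
                                         # signal's spectrum is conjugate-
                                         # symmetric, so the second half
                                         # carries no extra information
features = features / features.max()     # scale to [0, 1] for network input

print(len(features), features.argmax())  # 64 features, peak at bin 5
```

Only half of the FFT output is kept because the input signal is real-valued, so the upper 64 bins mirror the lower 64.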
In recent years, many books and articles on neural networks have been published
[68,69], and neural network products are now available from many vendors.
Many different rules for training a neural network have been described in the
literature. Back-error propagation is one of the most widely used and is applicable
to a wide range of problems. Given a set of inputs, the network is trained to produce
a desired response. If the network gives the wrong answer, it is corrected by
adjusting its weights so that the error is reduced. During this correction process,
one starts at the output nodes and propagates the error backward to the input nodes
(hence back-propagation). The process is then repeated until the error is acceptably
small.
To illustrate the procedure for training a neural network using the back-
propagation rule, consider a simple three-layer network with seven nodes, as shown
in Figure 10.65. The input layer consists of three nodes, and the hidden and output
layers each consist of two nodes. Given the following set of inputs: input No. 1 = 1
into node 0, input No. 2 = 1 into node 1, and input No. 3 = 0 into node 2, the
network is to be trained to yield the desired output 0 at node 0 and 1 at node 1. Let
the subscripts i, j, k be associated with the first, second, and third layers,
respectively. A set of random weights is initially chosen, as shown in Figure 10.65.
For example,
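The training procedure for this seven-node network can be sketched in code. This is a minimal illustration, assuming sigmoid activations, a squared-error cost, and a learning rate of 0.5; the random initial weights and the number of iterations are assumptions, not the values of Figure 10.65.

```python
import numpy as np

# Hypothetical sketch of back-propagation for the three-layer network
# of Figure 10.65: 3 input nodes (layer i), 2 hidden nodes (layer j),
# 2 output nodes (layer k). Weights here are random, not the figure's.

rng = np.random.default_rng(0)

x = np.array([1.0, 1.0, 0.0])   # inputs into nodes 0, 1, 2
t = np.array([0.0, 1.0])        # desired outputs at nodes 0 and 1

W_ij = rng.uniform(-0.5, 0.5, size=(2, 3))  # input -> hidden weights
W_jk = rng.uniform(-0.5, 0.5, size=(2, 2))  # hidden -> output weights
eta = 0.5                                   # assumed learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, t, W_ij, W_jk):
    # Forward pass through hidden and output layers
    h = sigmoid(W_ij @ x)
    y = sigmoid(W_jk @ h)
    # Backward pass: start at the output nodes ...
    delta_k = (y - t) * y * (1.0 - y)
    # ... then propagate the error back toward the input nodes
    delta_j = (W_jk.T @ delta_k) * h * (1.0 - h)
    # Adjust the weights so that the error is reduced
    W_jk = W_jk - eta * np.outer(delta_k, h)
    W_ij = W_ij - eta * np.outer(delta_j, x)
    return W_ij, W_jk, y

err_before = None
for step in range(2000):
    W_ij, W_jk, y = train_step(x, t, W_ij, W_jk)
    if err_before is None:
        err_before = np.sum((y - t) ** 2)

print("final outputs:", y)  # should approach the targets [0, 1]
```

Each iteration performs one forward pass and one backward correction, and repeating the process drives the outputs toward the desired values, as described above.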