Digital Signal Processing Reference
In-Depth Information
[Figure: annotated diagram of the network structure — 16 neurons in layer 1 (16 parameters at the input), 12 neurons in layer 2, 8 neurons in layer 3 (8 parameters at the output). Each of the 36 neurons has an adjustable threshold (bias value), which makes the neural network more efficient. From each of the 16 neurons in layer 1, 12 connections go to layer 2 (16 * 12 connections or weights overall); from each of the 12 neurons in layer 2, 8 connections go to layer 3 (12 * 8 connections or weights overall).]
Illustration 288: Structure and network data of an *.nn file
The neural network generated from this *.nn file will have 16 input neurons in layer 1 and 8 output neurons in
layer 3. Due to the automated calculation of the network, the hidden layer here has 12 neurons. Each
neuron is connected to every neuron of the subsequent layer. As a result, there are 16 × 12 = 192 connections
and weightings between layer 1 and layer 2, and 12 × 8 = 96 connections and weightings between layer 2 and layer 3.
The Illustration includes further explanations.
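The connection and bias counts above can be checked with a small calculation. The following is a minimal sketch (the variable names are illustrative, not taken from the software) that counts the weights and biases of the fully connected 16-12-8 network:

```python
# Neurons per layer of the example network: input, hidden, output.
layers = [16, 12, 8]

# Each neuron of one layer is connected to every neuron of the
# subsequent layer, so the weight count is the product of the
# neighbouring layer sizes, summed over all layer pairs.
weights = sum(a * b for a, b in zip(layers, layers[1:]))

# Each of the 36 neurons has its own adjustable threshold (bias).
biases = sum(layers)

print(weights)  # 16*12 + 12*8 = 192 + 96 = 288
print(biases)   # 16 + 12 + 8 = 36
```

This confirms the figures in the Illustration: 192 weights between layers 1 and 2, 96 between layers 2 and 3, and 36 bias values in total.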
The number of output neurons results directly from the number of target values given to
the input N of the “collect network data” module. An XOR network therefore always has
two input neurons and one output neuron.
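The rule that the layer sizes follow from the training data can be illustrated with the XOR case. A minimal sketch (the pattern list is a hypothetical stand-in for the data collected by the module):

```python
# Each XOR training pattern pairs two input values with one target value.
xor_patterns = [
    ([0, 0], [0]),
    ([0, 1], [1]),
    ([1, 0], [1]),
    ([1, 1], [0]),
]

# The number of input and output neurons follows directly from the
# length of the input and target vectors of any one pattern.
inputs, targets = xor_patterns[0]
n_input_neurons = len(inputs)    # 2
n_output_neurons = len(targets)  # 1
```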
A glimpse behind the scenes of network design
The menu of the “collect network data” module includes some more adjustable
parameters and Illustrations of the back propagation process. The decisive step is the
transformation of a *.nnd file, by means of training, into a *.nn network file. Here is some
further information on this process: