Table 5.2

               Parameters   Sigmoid Transforms   Multiplications   Additions
Proposed NN    34           10                   24                26
MLP            50           10                   40                50
The coupling matrix simulated in this example was taken as

    H = [ 1      0.3 ]
        [ 0.03   0.1 ]
In this example, the proposed block-oriented NN scheme was composed of two blocks of N = 5 neurons each, so the total number of neurons is 10. We have compared this scheme with an MLP structure composed of ten neurons. The erf function was used as the activation function.
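The excerpt does not spell out the internal structure of either network, so the sketch below is only one plausible arrangement: a two-channel observation x = H s coupled by the matrix above, two independent 5-neuron erf blocks (one per channel) for the proposed scheme, and a single fully connected layer of ten erf neurons for the MLP. The signal model, weight shapes, and output combiners are assumptions; only N = 5, the erf activation, and the coupling matrix come from the text.

    # Hypothetical sketch of the two compared structures (shapes and signal
    # model assumed; only N = 5, erf, and H come from the text).
    import numpy as np
    from scipy.special import erf

    rng = np.random.default_rng(0)

    H = np.array([[1.0, 0.3],
                  [0.03, 0.1]])      # coupling matrix from the example

    N = 5                            # neurons per block (two blocks -> 10 neurons)

    def block_nn(x, W_blocks, v_blocks):
        """Block-oriented structure: one N-neuron erf block per channel;
        each block maps the coupled observation to its own channel estimate."""
        y = np.empty(2)
        for k in range(2):
            hidden = erf(W_blocks[k] @ x)   # 5 erf neurons in block k
            y[k] = v_blocks[k] @ hidden     # linear output combiner of block k
        return y

    def mlp(x, W, v):
        """Reference MLP: one fully connected layer of 2*N = 10 erf neurons."""
        return v @ erf(W @ x)

    # Coupled observation of a 2-D source vector s (assumed signal model)
    s = rng.standard_normal(2)
    x = H @ s

    W_blocks = [rng.standard_normal((N, 2)) for _ in range(2)]
    v_blocks = [rng.standard_normal(N) for _ in range(2)]
    W, v = rng.standard_normal((2 * N, 2)), rng.standard_normal((2, 2 * N))

    print(block_nn(x, W_blocks, v_blocks), mlp(x, W, v))

Because each block owns only N = 5 hidden neurons and a 5-term output combiner, its per-output workload is roughly half that of the joint 10-neuron MLP, which is consistent with the trend shown in Table 5.2.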
The complexity of the algorithms is displayed in Table 5.2.
As can be seen in the table, the number of multiplications and additions in the MLP
structure is almost twice that of the proposed block structure.
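The factor of roughly two is consistent with the weight structure: a block-diagonal arrangement with two blocks touches about half as many weights per input as a fully connected one. The toy count below only illustrates this mechanism; the exact figures in Table 5.2 follow from the specific algorithm and also include the output and adaptation stages, which this sketch ignores.

    # Generic count: multiplications needed to apply a dense n-by-n weight
    # matrix versus a block-diagonal one with n_blocks blocks of size n/n_blocks.
    def dense_mults(n: int) -> int:
        # a dense n-by-n matrix applied to an n-vector
        return n * n

    def block_diag_mults(n: int, n_blocks: int) -> int:
        # the same vector processed by n_blocks independent smaller blocks
        size = n // n_blocks
        return n_blocks * size * size

    n = 10
    print(dense_mults(n))            # 100
    print(block_diag_mults(n, 2))    # 50: two blocks halve the count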
We have tested both algorithms for a range of μ values belonging to the interval [10^-5, 10^-2] under various initial conditions. The proposed block structure has always outperformed the MLP.
Figure 5.6 shows the learning curves of the block structure and the MLP for μ = 0.005. The proposed approach achieves a lower MSE and faster convergence.
Figure 5.6  Smoothed MSE curves for the MLP and the proposed NN (an averaging window of 1,000 samples has been used), μ = 0.005. MSE is shown on a logarithmic scale (10^-5 to 10^-1) versus the iteration count (× 5,000 iterations, from 0 to 100).
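According to the caption of Figure 5.6, the displayed curves are moving averages of the squared error over 1,000 samples. A minimal sketch of such smoothing is shown below; the error sequence is synthetic and the function name is ours, only the window length comes from the text.

    # Smoothing a raw squared-error sequence with a 1,000-sample moving
    # average, as described in the caption of Figure 5.6.
    import numpy as np

    def smoothed_mse(err, window=1000):
        """Moving average of the instantaneous squared error."""
        sq = np.asarray(err, dtype=float) ** 2
        kernel = np.ones(window) / window
        return np.convolve(sq, kernel, mode="valid")

    # Synthetic decaying error sequence, used only to exercise the function
    rng = np.random.default_rng(0)
    n_iter = 100_000
    err = np.exp(-np.arange(n_iter) / 2e4) * rng.standard_normal(n_iter)

    mse_curve = smoothed_mse(err)   # typically plotted on a log scale vs. iteration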
 
 