in different networks was carried out for up to 12,000 epochs using the C RPROP algorithm (μ+ = 1.2, μ− = 0.5, Δmax = 0.001, Δmin = 10^(−6), Δ0 = 0.0005). The C BP
failed to train any network for the bilinear transformation, as saturation was observed during learning. With the C RPROP algorithm, however, training and generalization were achieved successfully. The trained networks are able to generalize this mapping from the 'z' plane to the 'zz' plane over radii of disks varying from 0.05 to 0.5 at 10 regular intervals. The normalized test input-output mapping is shown in Fig. 5.5b. Figure 5.5c-f present the transformation results of these test patterns with the different networks. The output of the C RSP neuron based network, displayed in Fig. 5.5f, shows that it gives the best generalization compared with the other networks. The superiority of the C RSP neuron is again evident in this case.
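The RPROP-style update underlying this training can be sketched as follows. This is a minimal illustration of the resilient-propagation step-size rule applied to one real weight component (in the complex case it would be applied componentwise to real and imaginary parts); the parameter values mirror those quoted above, whose exact assignment in the source is partly garbled, and the sign-flip handling shown (zeroing the gradient after a flip) is one common variant, assumed here rather than taken from the source.

```python
import numpy as np

# Step-size factors and bounds as quoted (reconstructed) in the text.
MU_PLUS, MU_MINUS = 1.2, 0.5      # step-size increase / decrease factors
DELTA_MAX, DELTA_MIN = 0.001, 1e-6
DELTA_0 = 0.0005                  # initial step size

def rprop_step(w, grad, prev_grad, delta):
    """One RPROP update for a single real weight component.

    The step size `delta` adapts to the sign history of the gradient:
    it grows while the sign is stable and shrinks after a sign flip.
    """
    sign_change = grad * prev_grad
    if sign_change > 0:           # same sign: accelerate
        delta = min(delta * MU_PLUS, DELTA_MAX)
    elif sign_change < 0:         # sign flip: overshoot, back off
        delta = max(delta * MU_MINUS, DELTA_MIN)
        grad = 0.0                # skip the update on this step
    w = w - np.sign(grad) * delta
    return w, grad, delta
```

Because only the sign of the gradient is used, the rule is insensitive to the gradient's magnitude, which is what lets it escape the flat (saturated) regions where plain C BP stalls.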
5.3 Concluding Remarks
The linear and bilinear transformations are frequently employed, and the example presented here is only a small sample of their innumerable applications. Because these functions are so widely used, the CVNN with conventional as well as higher-order neurons was applied to them to study convergence. The mapping problems demonstrated in this chapter represent typical functions encountered in practice. The results of the mapping applications demonstrate that the standard CVNN can be replaced by the higher-order neurons presented in the earlier chapter for better performance. The complex mapping was found to be sensitive to the normalization of the input and output patterns. The complex activation functions (C AF) demand that the data to be mapped be restricted to a range; otherwise, the functions' effective contribution to the weights diminishes for larger values of the parameter (the functions flatten for large arguments, so their slope decreases). For the bilinear transformation, this restriction on implementing the CVNN algorithms manifests as a constrained range on the parameters. For circles of large radius lying beyond the unit circle, a normalization factor must be introduced to bring the entire output within the unit circle before using the CVNN with this activation function.
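The normalization step described above can be sketched as follows: the bilinear transform is applied to points on disks of radius 0.05 to 0.5 (as in the generalization test earlier in the chapter), and the resulting targets are rescaled into the unit circle before being handed to the CVNN. The Möbius coefficients a, b, c, d below are illustrative assumptions, not the values used in the source.

```python
import numpy as np

def bilinear(z, a=1 + 0j, b=1j, c=1j, d=1 + 0j):
    """Bilinear (Mobius) transformation w = (a z + b) / (c z + d).

    Coefficients are illustrative; the pole -d/c = 1j lies outside
    the sampled disks (|z| <= 0.5), so the map is well defined here.
    """
    return (a * z + b) / (c * z + d)

def unit_circle_normalize(w):
    """Scale a set of complex samples so that all moduli are <= 1."""
    scale = np.max(np.abs(w))
    return w / scale if scale > 1.0 else w

# Test inputs: points on disks of radius 0.05 .. 0.5 at 10 regular
# intervals, matching the generalization test described in the text.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
radii = np.linspace(0.05, 0.5, 10)
z = np.concatenate([r * np.exp(1j * theta) for r in radii])

# Normalized targets, safe to feed to a bounded complex activation.
w = unit_circle_normalize(bilinear(z))
```

Dividing by the maximum modulus preserves the phase of every target and only compresses magnitudes, which is exactly what a bounded C AF needs: large-modulus targets would otherwise sit on the flat tail of the activation, where the gradient contribution vanishes.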
Simulations on these complex-valued problems clearly demonstrate the robustness and functional superiority of the proposed neurons over conventional neurons. These neurons make it possible to solve the problems with a smaller network and fewer learning parameters. They have also demonstrated faster learning and better approximation. In a few problems it was found that the ANN units become trapped in early saturation with the C BP algorithm, which precludes any significant improvement in the training weights and causes an unnecessary increase in the number of iterations required to train the ANN. This situation is most serious in the bilinear transformation, where C BP failed to converge altogether. We obtained good results for this problem with C RSP and C RPROP. Moreover, the modified C RPROP demonstrated consistently better performance, with a drastic reduction in training epochs. The efficient solutions obtained for conformal mapping confirm the ability of complex-domain neurons to process the magnitude and phase of data/signals properly.
 