Fig. 13.10 Crossover of groups of neurons to create new descendant individuals (parent 1 and parent 2 produce descendant 1 and descendant 2)
the optimization that is performed around a local minimum. Four kinds of mutation can be applied: delete a neuron, add a neuron, delete a layer, or duplicate a layer. The probability of a mutation occurring during the evolution is constant. Another important genetic operation is reproduction, i.e. the creation of a descendant that is an identical copy of a parent. Naturally, this operation is applied more and more frequently in the later stages of the evolution, so that the best-adapted individuals prevail.
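Purely as an illustration, the sketch below shows how the four mutation operators and the reproduction (copy) operator could act on a genome that stores only the vector of layer sizes. The constant mutation probability P_MUT, the function names and the genome representation are assumptions, not the implementation used here.

```python
import random

P_MUT = 0.05  # constant mutation probability (assumed value)

def mutate(layer_sizes, rng=random):
    """Apply at most one of the four mutations to a list of layer sizes:
    delete neuron, add neuron, delete layer, duplicate layer."""
    sizes = list(layer_sizes)
    if rng.random() > P_MUT or not sizes:
        return sizes                              # no mutation this time
    op = rng.choice(["del_neuron", "add_neuron", "del_layer", "dup_layer"])
    i = rng.randrange(len(sizes))                 # layer hit by the mutation
    if op == "del_neuron" and sizes[i] > 1:
        sizes[i] -= 1                             # remove one neuron
    elif op == "add_neuron":
        sizes[i] += 1                             # add one neuron
    elif op == "del_layer" and len(sizes) > 1:
        del sizes[i]                              # drop the whole layer
    elif op == "dup_layer":
        sizes.insert(i, sizes[i])                 # duplicate the layer
    return sizes

def reproduce(layer_sizes):
    """Reproduction: the descendant is an identical copy of the parent."""
    return list(layer_sizes)
```

Note that a newly added neuron or a duplicated layer would also need freshly initialized connection weights; this detail is omitted from the sketch.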
The newly created population is trained again and then subjected to quality determination (grading). Consecutive populations of ANNs are created, trained and graded until the optimization criterion is fulfilled. The evolutionary process described should converge to an optimum that represents the most appropriate neural network topology.
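The create-train-grade cycle described above can be summarized by a sketch like the following, assuming hypothetical helper callables train, grade, select, crossover and mutate, and a simple stopping rule (fitness threshold or generation limit):

```python
import random

def evolve(population, train, grade, select, crossover, mutate,
           max_generations=100, target_fitness=None, rng=random):
    """Evolve ANN topologies: train and grade every individual, then breed
    the next population until the optimization criterion is fulfilled."""
    for _ in range(max_generations):
        for net in population:
            train(net)                                # train each ANN anew
        fitness = [grade(net) for net in population]  # quality determination
        if target_fitness is not None and max(fitness) >= target_fitness:
            break                                     # criterion fulfilled
        parents = select(population, fitness)         # keep best-graded ANNs
        offspring = []
        while len(offspring) < len(population):
            p1, p2 = rng.sample(parents, 2)           # pick two parents
            offspring.append(mutate(crossover(p1, p2)))
        population = offspring
    return max(population, key=grade)                 # best topology found
```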
Below, the genetic procedure is described in more detail and several implementation aspects (crucial for the final results) are discussed.
Application of the GA principles to ANN structure optimization is an example of natural coding (not binary but, here, real-valued). The coding of particular individuals was very compact: all the ANNs of a given population were represented by a record of the following parameters (genes), for which a possible representation is sketched after the list:
• ANN consecutive number k,
• Number of layers l_k,
• Vector of layer sizes (numbers of neurons in the layers) N_k,
• Matrix of the connection weights W_k,
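One possible concrete form of this record, given here purely for illustration (the field names and the use of per-layer NumPy matrices for W_k are assumptions), is:

```python
from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class ANNGenome:
    """Real-valued (non-binary) coding of one individual of the population."""
    k: int                      # consecutive number of the ANN
    num_layers: int             # l_k: number of layers
    layer_sizes: List[int]      # N_k: vector of layer sizes (neurons per layer)
    weights: List[np.ndarray]   # W_k: connection-weight matrices between layers

# Hypothetical individual with layer sizes 4-6-2.
genome = ANNGenome(
    k=1,
    num_layers=3,
    layer_sizes=[4, 6, 2],
    weights=[np.random.randn(6, 4), np.random.randn(2, 6)],
)
```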