knowledge about the ANN's architecture. It is difficult to claim that any
single rule is optimal for all ANNs. Hence an ANN should have the ability
to adjust its learning rule adaptively according to its architecture and the
task to be performed. The evolution of learning rules has therefore been
introduced into ANNs so that they can learn their own learning rules.
The relationship between evolution and learning is extremely complex.
Various models have been proposed,84-92 but most of them address either
how learning can guide evolution84,85 or the relationship between the
evolution of architectures and that of connection weights.86-88
Algorithm 3.4 describes the evolution of learning rules. If the ANN
architecture is predefined and fixed, the evolved learning rule should be
optimized toward this architecture. If a near-optimal learning rule for
different ANN architectures is to be evolved, the fitness evaluation should
be based on the average training result across different ANN architectures,
so that the rule does not overfit a particular architecture.
Algorithm 3.4. Evolution of learning rules
(1) Decode each individual into a learning rule.
(2) Construct a set of ANNs with randomly generated architectures and
initial connection weights, and train them using the decoded learning
rule.
(3) Calculate the fitness of each individual according to the average training
result.
(4) Select parents according to their fitness.
(5) Apply search operators to parents to generate offspring which form the
new generation.
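The five steps above can be sketched as a small program. The sketch below is illustrative only: it assumes the learning rule is encoded as the four coefficients (a, b, c, d) of a generic Hebbian-style local update, delta_w = a*pre*post + b*pre*err + c*post + d, applied to a single sigmoid unit on a toy task; the rule parameterization, the toy task, and all GA settings are assumptions, not a fixed implementation from the literature.

```python
import math
import random

random.seed(0)

# Toy task: a single sigmoid unit learning logical OR.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def sigmoid(z):
    z = max(-60.0, min(60.0, z))          # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def decode(ind):
    """Step (1): decode an individual into a learning rule (assumed form)."""
    a, b, c, d = ind
    return lambda pre, post, err: a * pre * post + b * pre * err + c * post + d

def train(rule, w, epochs=30):
    """Train one randomly initialized unit with the decoded rule;
    return its final mean squared error (the 'training result')."""
    w = list(w)                            # [w_x1, w_x2, bias]
    for _ in range(epochs):
        for x, t in DATA:
            post = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
            err = t - post
            w[0] += rule(x[0], post, err)
            w[1] += rule(x[1], post, err)
            w[2] += rule(1.0, post, err)   # bias treated as an always-on input
    return sum((t - sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])) ** 2
               for x, t in DATA) / len(DATA)

def fitness(ind, n_nets=3):
    """Steps (2)-(3): average the training result over several networks with
    random initial weights, so the rule is not tuned to a single network."""
    rule = decode(ind)
    errs = [train(rule, [random.uniform(-1, 1) for _ in range(3)])
            for _ in range(n_nets)]
    return -sum(errs) / len(errs)          # higher fitness = lower average error

def evolve(pop_size=20, gens=15):
    """Steps (4)-(5): fitness-based selection plus Gaussian mutation."""
    pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        pop = [pop[0]] + [                 # elitism: keep the best rule
            [g + random.gauss(0, 0.2) for g in random.choice(parents)]
            for _ in range(pop_size - 1)]
    pop.sort(key=fitness, reverse=True)
    return pop[0]

best_rule = evolve()
print("best rule coefficients:", best_rule)
print("average error of best rule:", -fitness(best_rule))
```

Note how the fitness of a rule is never computed on one fixed network: averaging over several randomly initialized networks is exactly the guard against overfitting a particular architecture described above.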
3.2.5. Evolution of algorithmic parameters
The adaptation of BP parameters such as the learning rate and momentum
through evolution may be considered as the first step of the evolution
of learning rules.71,93 Harp et al.71 evolved BP's parameters along
with the ANN's architecture. The simultaneous evolution of both
algorithmic parameters and architectures facilitates exploration of the
interactions between the learning algorithm and architectures, so that a
near-optimal combination of BP with an architecture can be found.
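To make the idea of evolving algorithmic parameters concrete, the sketch below evolves just a learning rate and a momentum coefficient for plain gradient descent on a toy one-weight regression problem. The task, the parameter ranges, and the GA details are all illustrative assumptions; a real study would train a full BP network and may also evolve the architecture alongside these parameters.

```python
import random

random.seed(1)

# Toy setup: the "architecture" is fixed (a single linear weight fitting
# y = 2x); only the hyperparameters (learning rate, momentum) are evolved.
XS = [0.5, 1.0, 1.5, 2.0]
YS = [2.0 * x for x in XS]

def train_with(lr, mom, epochs=20):
    """Gradient descent with momentum; returns the final MSE."""
    w, v = 0.0, 0.0
    for _ in range(epochs):
        grad = sum(2.0 * (w * x - y) * x for x, y in zip(XS, YS)) / len(XS)
        v = mom * v - lr * grad            # momentum-smoothed update
        w += v
    return sum((w * x - y) ** 2 for x, y in zip(XS, YS)) / len(XS)

def fitness(ind):
    lr, mom = ind
    return -train_with(lr, mom)            # lower final error = higher fitness

def evolve(pop_size=16, gens=10):
    pop = [[random.uniform(0.0, 0.5), random.uniform(0.0, 0.9)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]        # truncation selection
        pop = elite + [[max(0.0, g + random.gauss(0, 0.05))
                        for g in random.choice(elite)]
                       for _ in range(pop_size - len(elite))]
    pop.sort(key=fitness, reverse=True)
    return pop[0]

lr_best, mom_best = evolve()
print("evolved learning rate:", lr_best, "momentum:", mom_best)
print("final MSE with evolved parameters:", train_with(lr_best, mom_best))
```

Because the fitness function here is "final training error after a fixed budget", evolution implicitly trades off step size against stability: too small a learning rate fails to converge in time, while too aggressive a combination of rate and momentum oscillates.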
Other researchers60,78,93 evolved the BP parameters while the ANN's
architecture was kept fixed. The parameters evolved in this case tend to be