10.3 Solving Problems with GEP Neural Networks
The problems chosen to illustrate the evolution of linearly encoded neural
networks are two well-known problems of logic synthesis. The first one, the
exclusive-or problem, was chosen both for its historical importance in the
neural network field and for its simplicity. The second one, the 6-bit multi-
plexer, is a rather complex problem and is useful for evaluating the effi-
ciency of the GEP-NN algorithm.
10.3.1 Neural Network for the Exclusive-or Problem
The XOR is a simple Boolean function of two arguments and, therefore, can be
easily solved using linearly encoded neural networks.
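As a point of reference, the conventional seven-node solution to XOR (one output neuron fed by two hidden neurons, each reading both inputs) can be sketched with simple threshold units. The weights and thresholds below are illustrative choices, not values evolved by GEP-NN:

```python
def step(x, threshold):
    """Unit-step activation: fires (1) when the weighted sum reaches the threshold."""
    return 1 if x >= threshold else 0

def xor_net(a, b):
    """Seven-node XOR network: root neuron + 2 hidden neurons + 4 input leaves.
    Weights and thresholds are hand-picked for illustration only."""
    h1 = step(1.0 * a + 1.0 * b, 0.5)      # behaves like OR
    h2 = step(1.0 * a + 1.0 * b, 1.5)      # behaves like AND
    return step(1.0 * h1 - 1.0 * h2, 0.5)  # fires when h1 but not h2: XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))
```

Counting the root, the two hidden units, and the four input leaves gives the seven nodes of the conventional solution referred to later in this section.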
For this study, two different experiments were designed in order to illus-
trate the advantages of letting the algorithm choose and evolve the neural
network architecture (Table 10.1). The first uses a fairly redundant
organization in order to give us an idea of the kinds of solutions that can
be designed for this particular function; the second uses a much more
compact organization in order to force the system into designing only the
most parsimonious solutions.
The functions used to solve this problem have connectivities 2, 3, and 4,
and are represented, respectively, by “D”, “T”, and “Q”. For the experiment
summarized in the first column of Table 10.1, a head length of h = 4 was
chosen, which, together with the high connectivity level chosen for this
problem, allows the discovery of hundreds of different perfect solutions to
the XOR function. Most of them are more complicated than the conventional
solution with seven nodes presented on page 384; others have the same degree
of complexity evaluated in terms of total nodes; and others are, surprisingly,
more parsimonious than the aforementioned conventional solution to the XOR
function.
Consider, for instance, the first perfect solution found in the experiment
summarized in the first column of Table 10.1:
012345678901234567890123456789012
TQaTaaababbbabaaa6305728327795806
W = {1.175, 0.315, -0.738, 1.694, -1.215, 1.956, -0.342, 1.088, -1.694, 1.288}
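A gene like the one above can be decoded breadth-first (in Karva order) into a network: head symbols D, T, and Q claim 2, 3, and 4 children, respectively, and the Dw digits index into the weight array W. The sketch below assumes threshold units firing at 1 and one Dw digit consumed per connection in the order connections are created; the book's exact weight-assignment convention may differ, so the demonstration uses a hypothetical toy gene rather than the evolved solution above:

```python
# Connectivities of the neural functions given in the text.
ARITY = {"D": 2, "T": 3, "Q": 4}

def eval_gepnn(gene, dw, W, inputs, threshold=1.0):
    """Decode a GEP-NN gene breadth-first and evaluate it on a dict of inputs.
    Weight-assignment order and the firing threshold are assumptions."""
    weights = [W[int(d)] for d in dw]   # each Dw digit indexes the weight array
    children = {}                       # node position -> [(child_pos, weight)]
    next_pos, w_i = 1, 0
    queue = [0]
    while queue:
        pos = queue.pop(0)
        sym = gene[pos]
        if sym in ARITY:                # a neuron: claim its children in order
            kids = []
            for _ in range(ARITY[sym]):
                kids.append((next_pos, weights[w_i]))
                queue.append(next_pos)
                next_pos += 1
                w_i += 1
            children[pos] = kids

    def value(pos):
        sym = gene[pos]
        if sym not in ARITY:            # terminal: read a network input
            return inputs[sym]
        total = sum(w * value(c) for c, w in children[pos])
        return 1 if total >= threshold else 0

    return value(0)

# Hypothetical toy gene (not from the text): one D neuron over inputs a, b
# with both connection weights 1.0 acts as a two-input OR gate.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, eval_gepnn("Dab", "00", [1.0], {"a": a, "b": b}))
```

With this breadth-first reading, the gene above decodes to a root T neuron whose children are a Q neuron, the input a, and a second T neuron, all drawn from the head and tail in order.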
As you can see in Figure 10.5, it is a rather complicated solution to the XOR
function, as it uses neurons with a relatively high number of connections.
The reason for choosing this flamboyant architecture for this simple