regression to classification problems to logic synthesis, for GEP-nets are as
versatile as all the GEP systems we have studied so far. I will, however,
restrict my presentation here to logic synthesis, and the chapter closes with
two illustrative Boolean problems: the simple exclusive-or function and the
much more complex 6-multiplexer function.
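For reference, both target functions are easy to state directly. A minimal sketch in Python (the function names and argument order are illustrative, not taken from the text):

```python
def xor(a, b):
    # exclusive-or: output is 1 when exactly one input is 1
    return a ^ b

def mux6(a1, a0, d0, d1, d2, d3):
    # 6-multiplexer: the two address bits a1, a0 select which of the
    # four data bits d0..d3 is passed through to the output
    return (d0, d1, d2, d3)[2 * a1 + a0]
```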
10.1 Genes with Multiple Domains for NN Simulation
A neural network, with all its elements, is a rather complex structure, not easily
constructed and/or trained to perform a certain task. Consequently, it is common
to use hybrid algorithms in which a genetic algorithm evolves only partial
aspects of neural networks, such as the weights, the thresholds, and the
network architecture (see Whitley and Schaffer 1992 for a collection of
articles on neural networks and genetic algorithms).
Due to the simplicity and plasticity of gene expression programming, it is
possible to fully encode complex neural networks of different sizes and shapes
in linear chromosomes of fixed length. Indeed, by expressing them, these
complex structures become fully functional and, therefore, they can grow
and adapt in a particular training environment and then be selected accord-
ing to fitness in that particular environment. And this means that populations
of these entities can be used to explore a solution landscape and, therefore,
evolve solutions virtually to all kinds of problems.
In GEP nets, the network architecture is encoded in the familiar structure
of a head/tail domain. The head contains special functions (neurons) that
activate the hidden and output units (in the GEP context, more appropriately
called functional units) and terminals that represent the input units. The tail
contains obviously only terminals. Besides the head and the tail, these genes
(neural network genes or NN-genes) contain two additional domains, Dw
and Dt, encoding, respectively, the weights and the thresholds of the neural
network encoded in the head/tail domain. Structurally, Dw comes after the
tail, and its length dw depends on the head size h and the maximum arity
nmax: dw = h · nmax. Dt comes after Dw and has a length dt equal to h.
Both domains are composed of symbols representing the weights and
thresholds of the neural network.
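This chromosome organization can be sketched as follows, assuming the standard GEP tail-length rule t = h(nmax - 1) + 1; the particular function and terminal symbols below are illustrative only:

```python
import random

def nn_gene_lengths(h, n_max):
    # Standard GEP head/tail rule: tail length t = h*(n_max - 1) + 1,
    # which guarantees a syntactically valid expression tree.
    t = h * (n_max - 1) + 1
    dw = h * n_max   # length of the weight domain Dw
    dt = h           # length of the threshold domain Dt
    return t, dw, dt

def random_nn_gene(h, n_max, functions, inputs, n_weights, n_thresholds):
    """Build one random linear NN-gene: head + tail + Dw + Dt.
    Symbols in Dw and Dt are indices into separate arrays of
    candidate weights and thresholds."""
    t, dw, dt = nn_gene_lengths(h, n_max)
    head = [random.choice(functions + inputs) for _ in range(h)]
    tail = [random.choice(inputs) for _ in range(t)]
    Dw = [random.randrange(n_weights) for _ in range(dw)]
    Dt = [random.randrange(n_thresholds) for _ in range(dt)]
    return head + tail + Dw + Dt
```

For example, with h = 3 and nmax = 2 the gene has a tail of length 4, a Dw of length 6, and a Dt of length 3, for a total length of 16 symbols.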
For each NN-gene, the weights and thresholds are randomly created at the
beginning of each run, but their circulation in the population is guaranteed
by the usual genetic operators of mutation, inversion, transposition, and
recombination.
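A minimal sketch of this scheme: the parameter arrays are created once per run, and a point mutation restricted to the Dw/Dt domains recirculates them by reassigning which array entry a domain symbol points to (the sampling range [-2, 2] and the mutation rate are assumptions, not values from the text):

```python
import random

def init_parameters(n_weights=10, n_thresholds=10):
    # Created once at the beginning of a run; the symbols in Dw and Dt
    # are indices into these arrays.
    weights = [random.uniform(-2.0, 2.0) for _ in range(n_weights)]
    thresholds = [random.uniform(-2.0, 2.0) for _ in range(n_thresholds)]
    return weights, thresholds

def mutate_dw_dt(gene, h, n_max, n_symbols, rate=0.05):
    # Point mutation restricted to the Dw and Dt domains: each symbol is
    # replaced, with probability `rate`, by a new random index, while the
    # head/tail domain (the network architecture) is left untouched.
    t = h * (n_max - 1) + 1
    start = h + t                      # Dw begins right after the tail
    g = list(gene)
    for i in range(start, len(g)):
        if random.random() < rate:
            g[i] = random.randrange(n_symbols)
    return g
```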