5. Adding input neurons. This step adds input neurons for known input features that are
not referred to in the rules, because an imperfect rule set may not identify every
input feature required to learn the concept correctly.
6. Adding links. This step adds links with weight zero to the network using the
neuron numbering established in Step 4.
7. Perturbing. The final step is to perturb the network by adding a small random
number to each of the link weights.
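Steps 5-7 above can be sketched as follows. This is an illustrative reconstruction, not KBANN's actual implementation; the function and parameter names (`finish_kbann_network`, `epsilon`, the feature names in the comments) are hypothetical.

```python
import random

def finish_kbann_network(weights, known_inputs, rule_inputs, hidden,
                         epsilon=0.01, rng=None):
    """Hypothetical sketch of KBANN Steps 5-7 on a weight dictionary
    keyed by (source, destination) neuron pairs."""
    rng = rng or random.Random(0)
    # Step 5: add input neurons for known features the rules never mention.
    inputs = list(rule_inputs) + [f for f in known_inputs if f not in rule_inputs]
    # Step 6: add zero-weight links from every input neuron to every hidden neuron
    # that the rule-derived links did not already connect.
    for i in inputs:
        for h in hidden:
            weights.setdefault((i, h), 0.0)
    # Step 7: perturb every link weight by a small random number.
    for k in weights:
        weights[k] += rng.uniform(-epsilon, epsilon)
    return inputs, weights
```

The zero-weight links in Step 6 leave the rule-derived behavior unchanged at first, while the perturbation in Step 7 breaks symmetry so gradient-based training can move the new links away from zero.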
However, the rule-to-network algorithm of KBANN cannot change its
network structure in response to the training records. Even if some records in the
database contain new features, it cannot find a new rule that goes beyond the
knowledge of the medical experts. To find new rules in the database, we use
structure-level adaptation of the NN, which develops a suitable network structure
in Step 7.
7.2.2 Structure-Level Adaptation of the NN [10]
Conventional neural networks show the following behaviors during learning:
• If a neural network does not have enough neurons to infer, the input weight
vector of each neuron may fluctuate greatly, even after a long enough period
of learning.
• If a neural network has more neurons than required to infer, the network may
retain unnecessary neurons even after the input weight vector of each neuron
converges.
In the first case, the network should generate a new neuron inheriting the
attributes of its parent neuron. In the second case, redundant neurons should be
deleted from the network through weight calculation. Based on these cases, we
determine the suitable conditions for neuron generation or annihilation in the
learning process.
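The two conditions above can be sketched as a per-neuron decision rule. This is a minimal illustration under assumed names and thresholds (`neuron_action`, `fluct_thresh`, `mag_thresh` are hypothetical, not from the source): a neuron whose input weight vector still fluctuates triggers generation, while a converged neuron with negligible weights is annihilated.

```python
def neuron_action(history, fluct_thresh=0.05, mag_thresh=1e-2):
    """Decide what to do with one hidden neuron, given `history`:
    a list of its input weight vectors over recent learning epochs."""
    # Average Euclidean distance between successive weight vectors.
    diffs = [sum((a - b) ** 2 for a, b in zip(v1, v2)) ** 0.5
             for v1, v2 in zip(history, history[1:])]
    fluct = sum(diffs) / len(diffs)
    if fluct > fluct_thresh:
        return "generate"      # still fluctuating: split off a child neuron
    norm = sum(x * x for x in history[-1]) ** 0.5
    if norm < mag_thresh:
        return "annihilate"    # converged but negligible weights: redundant
    return "keep"
```

In the generation case, the new neuron would inherit the attributes of its fluctuating parent, as described above.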
7.2.2.1 Neuron Generation
Neuron generation occurs when the representation power of the network is
insufficient. Given enough neurons in the hidden layer, a neural network can
approximate the target mapping precisely. Therefore, we use the stabilized error
as the index that determines whether the network needs to generate a new neuron.
If the stabilized error after an adequate period of learning is larger than the
desired value, a new neuron is generated.
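The stabilized-error trigger can be written compactly. This is a hedged sketch: the source does not specify how the error is stabilized, so averaging over a recent window is an assumption here, and the names `should_generate`, `window`, and `target` are illustrative.

```python
def should_generate(errors, window=5, target=0.05):
    """Return True when the stabilized training error (assumed here to be
    the mean over the last `window` epochs) still exceeds the desired value,
    signaling that a new hidden neuron should be generated."""
    recent = errors[-window:]
    stabilized = sum(recent) / len(recent)
    return stabilized > target
```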
The next problem is how to determine the position of the new neuron in the
hidden layer. The optimal position can be found through monitoring the behaviors
of the neurons in the hidden layer. A neuron that lacks adequate representation
power contributes to the final system error through the fluctuation of its input
weight vector, because that vector keeps fluctuating greatly even after an
adequate period of learning. When the neuron does not have enough