Figure 4.6 Sigmoid Curve (a = 6)
a. Feed the input values forward through the network to generate output value(s).
b. Compare the predicted output value with the actual output value.
c. Use a methodology known as back propagation to go back through the neurons, adjusting the weights slightly to improve the network's predictive performance with respect to the observation in question.
4. Repeat step 3 until a stopping point is reached.
Each execution of step 3 above is called an epoch. Depending on the size of the training set, sufficiently training the network may require thousands of epochs, while other networks may train in a hundred or fewer. Note: The mathematics of back propagation is not presented in this text. The interested reader who searches the Internet for “artificial neural network tutorial” will find a number of excellent tutorials with in-depth explanations of back propagation.
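To make the training loop concrete, here is a minimal sketch of steps 3a through 3c in Python. The single hidden layer, the names train and sigmoid, and all hyperparameter values are illustrative assumptions, not from the text:

```python
import numpy as np

def sigmoid(x, a=1.0):
    """Sigmoid activation; a controls steepness (Figure 4.6 uses a = 6)."""
    return 1.0 / (1.0 + np.exp(-a * x))

def train(X, y, n_hidden=4, epochs=5000, lr=0.5, seed=0):
    """One-hidden-layer network trained by back propagation (steps 3a-3c)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(n_hidden, y.shape[1]))   # hidden -> output weights
    for _ in range(epochs):            # each pass over the data is one epoch
        h = sigmoid(X @ W1)            # (a) feed the inputs forward...
        out = sigmoid(h @ W2)          #     ...to generate output values
        err = y - out                  # (b) compare predicted vs. actual output
        d_out = err * out * (1.0 - out)          # (c) back propagation: push the
        d_hid = (d_out @ W2.T) * h * (1.0 - h)   #     error back through the neurons
        W2 += lr * h.T @ d_out                   #     and adjust each weight slightly
        W1 += lr * X.T @ d_hid
    return W1, W2

# XOR: a small toy problem that requires the hidden layer
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, y)
print(sigmoid(sigmoid(X @ W1) @ W2).round(2))   # should approach [[0], [1], [1], [0]]
```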
Overfitting the model
In Chapter 1 the concept of model overfitting was introduced. ANN models
are especially prone to overfitting when care is not taken in their construction.
Given enough training epochs and sufficient neurons in the hidden layers,
an ANN can be trained to almost perfectly fit any dataset. Consider the points
in Figure 4.7a. A good regression model for the data is represented by the
curve in Figure 4.7b. Yet, an overtrained ANN regression could produce a
model similar to Figure 4.7c. Which model, b or c, would you expect to
generalize better?
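A common guard against this kind of overtraining is to hold out part of the data as a validation set and stop training once the validation error stops improving (early stopping). The sketch below applies this idea to the same style of one-hidden-layer network; fit_with_early_stopping, the patience parameter, and all values are illustrative assumptions, not from the text:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fit_with_early_stopping(X_tr, y_tr, X_val, y_val,
                            n_hidden=10, lr=0.1, max_epochs=10_000, patience=50):
    """Train as in the loop above, but watch the error on held-out
    validation data and stop once it has not improved for `patience` epochs."""
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(X_tr.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    best_err, best_W, stale = np.inf, (W1.copy(), W2.copy()), 0
    for _ in range(max_epochs):
        h = sigmoid(X_tr @ W1)                 # forward pass; linear output
        out = h @ W2                           # suits a regression target
        err = y_tr - out
        d_hid = (err @ W2.T) * h * (1.0 - h)   # back propagation
        W2 += lr * h.T @ err / len(X_tr)
        W1 += lr * X_tr.T @ d_hid / len(X_tr)
        val_err = np.mean((y_val - sigmoid(X_val @ W1) @ W2) ** 2)
        if val_err < best_err:                 # still improving: remember weights
            best_err, best_W, stale = val_err, (W1.copy(), W2.copy()), 0
        else:
            stale += 1                         # no improvement for too long: stop
            if stale >= patience:
                break
    return best_W                              # weights from the best epoch

# Noisy points around a smooth curve, in the spirit of Figure 4.7a
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X) + rng.normal(scale=0.2, size=X.shape)
W1, W2 = fit_with_early_stopping(X[:40], y[:40], X[40:], y[40:])
```

Stopping at the best validation epoch, rather than training to convergence, is what keeps the fitted curve closer to Figure 4.7b than to the overtrained model of Figure 4.7c.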
 