this topic, such as artificial neural networks (ANNs), and other flexible nonlinear estimation methods, such as kernel regression models (SVM) and smoothing splines, are susceptible to either overfitting or underfitting. Underfitting is mainly a result of design or training incompetence on the part of the modeler: a network that is not sufficiently complex to handle nonlinear data processing may fail to detect the full characteristics of the signal, which leads to underfitting. If a network is too complex, it leads to a dangerous situation called overfitting, in which the model gives good predictions on the training data but poor predictions for future values.
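Both failure modes can be reproduced numerically. The sketch below (an illustration added here, not from the original text; the polynomial degrees and noise level are arbitrary choices) fits a too-simple and a too-complex model to noisy samples of a smooth signal and compares training and test errors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth signal; test points lie between training points
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = (x_train[:-1] + x_train[1:]) / 2
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, x_test.size)

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

# Underfitting: a degree-1 model is too simple to follow the sine at all
under = np.polyfit(x_train, y_train, 1)
# Overfitting: a degree-15 model chases the noise in the training set
over = np.polyfit(x_train, y_train, 15)

for name, coef in [("underfit (deg 1)", under), ("overfit (deg 15)", over)]:
    tr = rmse(y_train, np.polyval(coef, x_train))
    te = rmse(y_test, np.polyval(coef, x_test))
    print(f"{name}: train RMSE = {tr:.3f}, test RMSE = {te:.3f}")
```

The underfit model is poor on both sets, while the overfit model has a small training error and a much larger test error, the signature described above.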
The complexity of a network is related to both the size of the weights and the number of hidden units and layers. Apart from that, model input selection and training data length influence the overfitting of nonlinear models. Overtraining can be detected during training by the use of a test set. The disadvantage of this split technique, however, is that in limited data cases the size of the training set is reduced considerably, which spoils the final performance. Another easily adoptable approach is to rotate parts of the available data sets as the training set and the test set. In some cases, strong nonlinearity of the problem may lead to overfitting, which is easily noticeable from the size of the weights.
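The rotation idea described above is what is commonly called k-fold cross-validation: each part of the data serves once as the test set while the remaining parts train the model, so no data is permanently sacrificed to testing. A minimal sketch in plain NumPy (the 5-fold split, the synthetic data, and the least-squares line standing in for the trained model are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 100)
y = 3.0 * x + rng.normal(0, 0.1, x.size)   # synthetic data for illustration

k = 5
# Shuffle the indices once, then split them into k disjoint folds
folds = np.array_split(rng.permutation(x.size), k)

test_errors = []
for i in range(k):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    # "Train": fit a least-squares line on the k-1 training folds
    coef = np.polyfit(x[train_idx], y[train_idx], 1)
    # "Test": error on the held-out fold
    resid = y[test_idx] - np.polyval(coef, x[test_idx])
    test_errors.append(float(np.sqrt(np.mean(resid ** 2))))

print("per-fold test RMSE:", [round(e, 3) for e in test_errors])
print("mean test RMSE:", round(float(np.mean(test_errors)), 3))
```

Averaging the per-fold errors gives a performance estimate that uses every observation for both training and testing.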
Figure 2.1 presents an example of the variation of model performance under scenarios such as overfitting, underfitting, and a reasonable model during the calibration (training) and testing (validation) phases.
However, there are some standard techniques to tackle overfitting to some extent, which are briefly described here.
1. Proper selection of model input structure and training data length: Tackled in
this topic and discussed through case studies and in the next chapter.
2. Jittering: Somewhat similar to data enrichment. In this method, an artificial noise is deliberately added to the inputs during training. A good example in
Fig. 2.1 Illustration of performance of an overfitted model, underfitted model and a reasonable model
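Jittering can be sketched in a few lines: each training pass sees the inputs perturbed by fresh, small, zero-mean noise, so the network cannot memorise exact input values. The noise scale below is an arbitrary illustrative choice, and the training step itself is left as a placeholder:

```python
import numpy as np

rng = np.random.default_rng(42)

def jitter(x_batch, scale=0.05, rng=rng):
    """Return a copy of the inputs with small Gaussian noise added.

    A fresh perturbation is drawn on every call, so each training
    epoch effectively sees a slightly different data set.
    """
    return x_batch + rng.normal(0.0, scale, size=x_batch.shape)

x = np.linspace(0, 1, 8)
for epoch in range(3):
    x_noisy = jitter(x)            # targets are left unchanged
    # ... feed (x_noisy, y) to the network's training step here ...
    print(f"epoch {epoch}: max |perturbation| = {np.max(np.abs(x_noisy - x)):.3f}")
```

Because the noise is redrawn each epoch, the effective training set is enriched without collecting new data, which discourages the network from fitting individual noisy points.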