3.6.4 Leave-One-Out Cross-Validation
Fig. 3.9 Data splitting in the leave-one-out cross-validation approach
Leave-one-out cross-validation is the degenerate case of the K-fold cross-validation method, in which K equals the number of samples. Research has shown that leave-one-out cross-validation often works relatively well for estimating the generalization error of continuous error functions such as the MSE, but it performs poorly for discontinuous error functions. In this approach, N experiments are performed if there are N samples in the data set: in each experiment, N − 1 samples are used as the training data and the remaining sample is used for testing. A pictorial description of this method is given in Fig. 3.9. A lack of continuity in the data can give adverse results with the leave-one-out approach, and even a small variation in the data may cause a large change in the selected model [11]. However, a study by Shao [67] has shown that, in the case of linear models, leave-one-out cross-validation is asymptotically equivalent to AIC and BIC.
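To make the procedure concrete, the following Python sketch (not from the original text; the linear least-squares model, the function name loo_cv_mse, and the synthetic data are illustrative assumptions) runs the N experiments described above, fitting on N − 1 samples, predicting the single held-out sample, and averaging the squared errors into a leave-one-out MSE estimate.

import numpy as np

def loo_cv_mse(X, y):
    """Leave-one-out cross-validation for a linear least-squares model.

    Runs N experiments: in each, one sample is held out for testing and
    the remaining N - 1 samples are used to fit the model. Returns the
    average squared prediction error over the N held-out samples.
    """
    n = len(y)
    errors = np.empty(n)
    for i in range(n):
        train = np.arange(n) != i                        # all samples except the i-th
        coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        pred = X[i] @ coef                               # predict the held-out sample
        errors[i] = (y[i] - pred) ** 2
    return errors.mean()

# Illustrative use with synthetic data (assumed, not from the text)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])  # intercept + one input series
y = 2.0 + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=20)
print("LOO-CV MSE estimate:", loo_cv_mse(X, y))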
3.6.5 Cross-Correlation Method
Unlike the above-mentioned CVA, a cross-correlation method is useful for identifying the most influential input data series for a particular output series. This