6.27 Resampling methods (5): k-Fold Cross-Validation
- James et al. (2013) use Figure 5.5 [p. 179] to explain k-Fold Cross-Validation. Please inspect the figure and describe it in a few words.
Advantages
- Computationally less intensive than LOOCV (Why?)
Q: Is k-Fold Cross-Validation a special case of LOOCV? Why?
Q: What is the advantage of using \(k = 5\) or \(k = 10\) rather than \(k = n\)?
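The questions above can be made concrete with a small sketch (illustrative Python, not code from the book, which uses R): k-fold CV requires only \(k\) model fits, one per fold, while LOOCV requires \(n\) fits; and setting \(k = n\) makes every fold a single observation, which is exactly LOOCV.

```python
def kfold_indices(n, k):
    """Partition indices 0..n-1 into k contiguous folds of near-equal size."""
    sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for s in sizes:
        folds.append(list(range(start, start + s)))
        start += s
    return folds

# k-fold CV fits one model per fold: 5 fits for k = 5.
print(len(kfold_indices(100, 5)))    # 5

# With k = n, each fold holds a single observation -- this is LOOCV,
# so LOOCV is the special case k = n, and it needs n = 100 model fits.
print(len(kfold_indices(100, 100)))  # 100
```

This is why the answer to the question above runs the other way round: LOOCV is a special case of k-fold CV (with \(k = n\)), not vice versa.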
Bias-variance trade-off
- Why is there a bias-variance trade-off? (The training sets are smaller than in LOOCV, so there is somewhat more bias, but larger than in the validation set approach, so there is less bias than there; at the same time, the \(k\) fitted models overlap less than the \(n\) LOOCV fits, which lowers the variance of the CV estimate.)
- Recommendation: use k-fold cross-validation with \(k = 5\) or \(k = 10\)
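A minimal end-to-end sketch of the recommended procedure (illustrative Python using only the standard library; the predictor here is simply the training-sample mean, an assumption made to keep the example self-contained):

```python
import random
import statistics

def kfold_cv_mse(y, k, seed=0):
    """Estimate the test MSE of the sample-mean predictor by k-fold CV."""
    rng = random.Random(seed)
    idx = list(range(len(y)))
    rng.shuffle(idx)                      # randomize before forming folds
    folds = [idx[i::k] for i in range(k)]  # k roughly equal-sized folds
    fold_mses = []
    for fold in folds:
        held_out = set(fold)
        train = [y[i] for i in range(len(y)) if i not in held_out]
        pred = statistics.fmean(train)     # "model": predict the training mean
        fold_mses.append(statistics.fmean((y[i] - pred) ** 2 for i in fold))
    return statistics.fmean(fold_mses)     # average test error over the k folds

random.seed(1)
y = [random.gauss(0.0, 1.0) for _ in range(50)]
print(kfold_cv_mse(y, k=5))   # the recommended 5-fold estimate
print(kfold_cv_mse(y, k=50))  # k = n reproduces LOOCV (n model fits)
```

With \(k = 5\) the loop fits 5 models; raising \(k\) toward \(n\) increases the cost and moves the estimate toward LOOCV.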
References
James, Gareth, Daniela Witten, Trevor Hastie, and Robert Tibshirani. 2013. An Introduction to Statistical Learning: With Applications in R. Springer Texts in Statistics. Springer.