Measuring the prediction error. A comparison of cross-validation, bootstrap and covariance penalty methods
| Author: | Agostino Di Ciaccio, Simone Borra |
| --- | --- |
| Year of publication: | 2010 |
| Subject: | Statistics and Probability; Statistics::Theory; Statistics::Methodology; Bootstrap aggregating; leave-one-out cross-validation; Cross-validation; covariance penalty; Resampling; Statistics; projection pursuit regression; bootstrap; extra-sample error; Mathematics; Parametric statistics; prediction error; Applied Mathematics; Estimator; Regression analysis; regression trees; neural networks; optimism; in-sample error; Computational Mathematics; Computational Theory and Mathematics; Projection pursuit; Settore SECS-S/01 - Statistica |
| Source: | Computational Statistics & Data Analysis 54: 2976–2989 |
| ISSN: | 0167-9473 |
| DOI: | 10.1016/j.csda.2010.03.004 |
| Description: | The estimators most widely used to evaluate the prediction error of a non-linear regression model are examined. An extensive simulation study compares the performance of these estimators across different non-parametric methods and for varying signal-to-noise ratios and sample sizes. The estimators considered are based on resampling: leave-one-out cross-validation, parametric and non-parametric bootstrap, repeated cross-validation, and hold-out. The non-parametric regression methods used are regression trees, projection pursuit regression, and neural networks. In the simulations, the repeated corrected 10-fold cross-validation estimator and the parametric bootstrap estimator performed best. (An illustrative cross-validation computation is sketched after this record.) |
| Database: | OpenAIRE |
| External link: | |
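
The sketch below is not the authors' code; it only illustrates the kind of estimator discussed in the description, namely repeated 10-fold cross-validation of the prediction error (mean squared error) of a regression tree. The data-generating process, sample size, noise level, tree settings, and number of repetitions are all assumed for illustration, and the bias correction ("corrected" cross-validation) studied in the paper is not included.

```python
# Minimal sketch (assumed setup, not the paper's simulation design):
# estimate the prediction error of a regression tree by repeated
# 10-fold cross-validation, averaging held-out squared errors.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import RepeatedKFold

rng = np.random.default_rng(0)

# Simulated non-linear regression data; n and sigma are illustrative choices.
n, sigma = 200, 0.5
X = rng.uniform(-2, 2, size=(n, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + sigma * rng.standard_normal(n)

# Repeated 10-fold cross-validation: refit the model on each training fold
# and record the mean squared error on the corresponding held-out fold.
cv = RepeatedKFold(n_splits=10, n_repeats=20, random_state=0)
fold_errors = []
for train_idx, test_idx in cv.split(X):
    model = DecisionTreeRegressor(min_samples_leaf=5, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    resid = y[test_idx] - model.predict(X[test_idx])
    fold_errors.append(np.mean(resid ** 2))

# The average over folds and repetitions is the CV estimate of the
# extra-sample prediction error.
print(f"Repeated 10-fold CV estimate of prediction error: {np.mean(fold_errors):.3f}")
```

The same loop structure can be reused for the other learners mentioned in the record (e.g. a neural network regressor) by swapping the model inside the loop; repeating the K-fold split reduces the variability of the estimate due to the random partitioning.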