A corrected Akaike criterion based on Kullback's symmetric divergence: applications in time series, multiple and multivariate regression
Author: A. Mkhadri, B. Hafidi
Year of publication: 2006
Subject: Statistics and Probability; Applied Mathematics; Model selection; Univariate; Stepwise regression; Computational Mathematics; Computational Theory and Mathematics; Bias of an estimator; Bayesian information criterion; Linear regression; Statistics; Akaike information criterion; Divergence (statistics); Mathematics
Source: Computational Statistics & Data Analysis, 50:1524-1550
ISSN: 0167-9473
DOI: 10.1016/j.csda.2005.01.007
Description: The evaluation of a counterpart to the Akaike information criterion (AIC), the Kullback information criterion (KIC), is considered. KIC is an approximately unbiased estimator of a risk function based on Kullback's symmetric divergence. However, when the sample size is small, or when the dimension of the candidate model is large relative to the sample size, this criterion exhibits a large negative bias. To overcome this problem, corrected versions, KICc, of the criterion are proposed for univariate autoregressive models and for multiple and multivariate regression models, thereby extending the methodology of McQuarrie and Tsai for AIC and AICc to the KIC criterion. The performance of the new criterion relative to competing criteria is examined in a large simulation study.
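For concreteness, the base criterion being corrected can be illustrated numerically. The minimal Python sketch below computes AIC, the Hurvich-Tsai correction AICc, and the uncorrected KIC of Cavanaugh (1999), KIC = -2 log L + 3k, for a Gaussian linear regression fit by least squares. The function name and the simulated data are illustrative assumptions, and the small-sample KICc penalty derived in the paper is not reproduced here.

```python
import numpy as np

def gaussian_criteria(y, X):
    """AIC, AICc, and KIC for a Gaussian linear model fit by OLS.

    KIC = -2 log L + 3k (Cavanaugh, 1999) is the uncorrected criterion;
    the paper's small-sample correction KICc is not reproduced here.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    sigma2 = rss / n                 # maximum-likelihood error variance
    k = p + 1                        # p coefficients plus the variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    aic = -2.0 * loglik + 2.0 * k
    aicc = aic + 2.0 * k * (k + 1) / (n - k - 1)   # Hurvich-Tsai correction
    kic = -2.0 * loglik + 3.0 * k
    return {"AIC": aic, "AICc": aicc, "KIC": kic}

# Toy comparison: the true model uses only the first two columns of X.
rng = np.random.default_rng(0)
n = 25
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X_full[:, :2] @ np.array([1.0, 0.5]) + rng.normal(size=n)
for p in (2, 3, 4):                  # candidate models of increasing size
    print(p, gaussian_criteria(y, X_full[:, :p]))
```

Minimizing any of these criteria over the candidate set selects a model; the paper's point, by analogy with AICc for AIC, is that KICc sharpens KIC's penalty when the sample size is small relative to the model dimension.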
Database: OpenAIRE
External link: https://doi.org/10.1016/j.csda.2005.01.007