
Regression model selection—a residual likelihood approach


Abstract:

Summary. We obtain the residual information criterion, RIC, a selection criterion based on the residual log-likelihood, for regression models including classical regression models, Box–Cox transformation models, weighted regression models and regression models with autoregressive moving average errors. We show that RIC is a consistent criterion. Simulation studies for each of the four models indicate that RIC provides better model order choices than the Akaike information criterion, the corrected Akaike information criterion, the final prediction error, C_p and the adjusted R^2, except when the sample size is small and the signal-to-noise ratio is weak; in this case, none of the criteria performs well. Monte Carlo results also show that RIC is superior to the consistent Bayesian information criterion, BIC, when the signal-to-noise ratio is not weak, and that it is comparable with BIC when the signal-to-noise ratio is weak and the sample size is large.
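For orientation, the sketch below computes the standard competing criteria named in the abstract (AIC, corrected AIC, BIC, final prediction error, Mallows' C_p and adjusted R^2) for a classical linear regression fitted by ordinary least squares. It is an illustrative example using textbook forms of these criteria, not code from the paper; the RIC formula itself is derived in the full article and is not reproduced here.

```python
# Illustrative sketch only: standard textbook criteria that RIC is compared
# against, for an OLS linear regression. RIC itself is not implemented here
# because its formula is not given in the abstract.
import numpy as np

def selection_criteria(X, y, sigma2_full=None):
    """Return AIC, corrected AIC, BIC, FPE, Mallows' C_p and adjusted R^2
    for the linear model y = X b + e fitted by least squares.

    X is assumed to include an intercept column; sigma2_full is an error
    variance estimate from the largest candidate model (used only by C_p).
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    rss = float(resid @ resid)
    sigma2 = rss / n                                  # ML error variance estimate

    aic = n * np.log(sigma2) + 2 * p
    aicc = aic + 2 * p * (p + 1) / (n - p - 1)        # corrected AIC
    bic = n * np.log(sigma2) + p * np.log(n)
    fpe = sigma2 * (n + p) / (n - p)                  # final prediction error
    tss = float(((y - y.mean()) ** 2).sum())
    r2_adj = 1 - (rss / (n - p)) / (tss / (n - 1))    # adjusted R^2
    cp = rss / sigma2_full - n + 2 * p if sigma2_full is not None else None

    return {"AIC": aic, "AICc": aicc, "BIC": bic, "FPE": fpe,
            "Cp": cp, "R2_adj": r2_adj}

# Example: evaluate nested candidate models on simulated data and compare
# the criteria; the "best" model minimizes AIC/AICc/BIC/FPE/Cp
# (or maximizes adjusted R^2).
rng = np.random.default_rng(0)
n = 50
Z = rng.normal(size=(n, 5))
y = 1.0 + 2.0 * Z[:, 0] - 1.5 * Z[:, 1] + rng.normal(scale=1.0, size=n)

full_X = np.column_stack([np.ones(n), Z])
full_beta, *_ = np.linalg.lstsq(full_X, y, rcond=None)
sigma2_full = float(((y - full_X @ full_beta) ** 2).sum()) / (n - full_X.shape[1])

for k in range(1, 6):
    X = np.column_stack([np.ones(n), Z[:, :k]])
    crit = selection_criteria(X, y, sigma2_full)
    print(k, {name: round(v, 2) for name, v in crit.items()})
```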

Keywords: Akaike information criterion; Bayesian information criterion; C_p; Corrected Akaike information criterion; Residual information criterion; Residual likelihood

Document Type: Research Article

DOI: https://doi.org/10.1111/1467-9868.00335

Publication date: 2002-05-01
