Model selection principles in misspecified models


Model selection is of fundamental importance to high-dimensional modelling featured in many contemporary applications. Classical principles of model selection include the Bayesian principle and the Kullback–Leibler divergence principle, which lead to the Bayesian information criterion and the Akaike information criterion respectively when models are correctly specified. Yet model misspecification is unavoidable in practice. We derive novel asymptotic expansions of the two well-known principles in misspecified generalized linear models, which yield the generalized Bayesian information criterion and the generalized Akaike information criterion. A specific form of prior probabilities motivated by the Kullback–Leibler divergence principle leads to the generalized Bayesian information criterion with prior probability, GBICp, which can be naturally decomposed as the sum of the negative maximum quasi-log-likelihood, a penalty on model dimensionality, and a direct penalty on model misspecification. Numerical studies demonstrate the advantage of the new methods for model selection in both correctly specified and misspecified models.
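The abstract describes a criterion that combines a goodness-of-fit term, a dimensionality penalty, and a misspecification penalty. As a rough illustration of that third ingredient only (this is a Takeuchi-style sketch, not the paper's exact GBICp formula), the following Python snippet fits a Gaussian working model by maximum likelihood and computes the sandwich-type penalty tr(J⁻¹K), where J is the average negative Hessian and K the average outer product of score vectors. When the model is correctly specified the penalty equals the parameter count p = 2; under misspecification (e.g. heavy-tailed data) it inflates, penalizing the misspecified fit.

```python
import numpy as np

def misspec_penalty(x):
    """Takeuchi-style misspecification penalty tr(J^{-1} K) for a Gaussian
    working model N(mu, sigma^2) fitted to data x by maximum likelihood.

    Illustrative sketch only; this is NOT the exact GBICp formula.
    """
    n = len(x)
    mu = x.mean()          # MLE of the mean
    s2 = x.var()           # MLE of the variance (biased version)
    r = x - mu

    # Per-observation score vectors: derivatives w.r.t. (mu, sigma^2).
    score = np.column_stack([r / s2,
                             -0.5 / s2 + r**2 / (2 * s2**2)])
    K = score.T @ score / n  # average outer product of scores

    # Average negative Hessian of the per-observation log-likelihood.
    J = np.array([[1.0 / s2,          r.mean() / s2**2],
                  [r.mean() / s2**2,  -0.5 / s2**2 + (r**2).mean() / s2**3]])

    # Equals p = 2 under correct specification, larger under heavy tails.
    return np.trace(np.linalg.solve(J, K))

def tic_like_criterion(x):
    """-2 * max log-likelihood + 2 * tr(J^{-1} K), in the spirit of TIC."""
    n = len(x)
    s2 = x.var()
    loglik = -0.5 * n * (np.log(2 * np.pi * s2) + 1.0)
    return -2.0 * loglik + 2.0 * misspec_penalty(x)
```

For Gaussian data the penalty is close to 2 (the AIC penalty), while for, say, t-distributed data with 5 degrees of freedom it is substantially larger, so the misspecified Gaussian fit is charged extra, which is the qualitative behaviour the abstract attributes to the misspecification term of GBICp.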

Keywords: Akaike information criterion; Bayesian information criterion; Bayesian principle; Generalized Akaike information criterion; Generalized Bayesian information criterion; Generalized Bayesian information criterion with prior probability; Kullback–Leibler divergence principle; Model misspecification; Model selection

Document Type: Research Article


Publication date: January 1, 2014

