Model selection principles in misspecified models
Publication type:
Article
Authors:
Lv, Jinchi; Liu, Jun S.
Affiliations:
University of Southern California; Harvard University
Journal:
JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY
ISSN/ISBN:
1369-7412
DOI:
10.1111/rssb.12023
Publication date:
2014
Pages:
141-167
Keywords:
Bayesian information criterion
generalized linear models
dimensional feature space
penalized likelihood
variable selection
consistency
complexity
Abstract:
Model selection is of fundamental importance to high dimensional modelling featured in many contemporary applications. Classical principles of model selection include the Bayesian principle and the Kullback-Leibler divergence principle, which lead to the Bayesian information criterion and the Akaike information criterion, respectively, when models are correctly specified. Yet model misspecification is unavoidable in practice. We derive novel asymptotic expansions of the two well-known principles in misspecified generalized linear models, which give the generalized Bayesian information criterion and the generalized Akaike information criterion. A specific form of prior probabilities motivated by the Kullback-Leibler divergence principle leads to the generalized Bayesian information criterion with prior probability, GBICp, which can be naturally decomposed as the sum of the negative maximum quasi-log-likelihood, a penalty on model dimensionality, and a penalty directly on model misspecification. Numerical studies demonstrate the advantage of the new methods for model selection in both correctly specified and misspecified models.
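The three-part decomposition of GBICp described in the abstract can be written schematically as follows. This is an illustrative sketch in generic notation, not the authors' exact expression: here ℓ_n(β̂_M) denotes the maximum quasi-log-likelihood of candidate model M, and the two penalty terms stand in for the dimensionality and misspecification penalties whose precise forms are derived in the paper.

```latex
\[
\mathrm{GBIC}_p(\mathcal{M})
  \;=\;
  \underbrace{-\,\ell_n\!\bigl(\hat{\boldsymbol{\beta}}_{\mathcal{M}}\bigr)}_{\substack{\text{negative maximum}\\ \text{quasi-log-likelihood}}}
  \;+\;
  \underbrace{\mathrm{pen}_{\dim}(\mathcal{M})}_{\substack{\text{penalty on model}\\ \text{dimensionality}}}
  \;+\;
  \underbrace{\mathrm{pen}_{\mathrm{mis}}(\mathcal{M})}_{\substack{\text{penalty on model}\\ \text{misspecification}}}
\]
```

Under correct specification the misspecification penalty becomes asymptotically negligible, which is consistent with the abstract's claim that the method performs well in both the correctly specified and misspecified settings.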
Source URL: