Empirical bias-reducing adjustments to estimating functions

Publication Type:
Article
Authors:
Kosmidis, Ioannis; Lunardon, Nicola
Affiliations:
University of Warwick; Università Ca' Foscari Venezia
Journal:
JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY
ISSN/ISBN:
1369-7412
DOI:
10.1093/jrsssb/qkad083
Publication Date:
2024
Pages:
62-89
Keywords:
GENERALIZED LINEAR-MODELS; likelihood; reduction; regression; bootstrap; inference
Abstract:
We develop a novel, general framework for reduced-bias M-estimation from asymptotically unbiased estimating functions. The framework relies on an empirical approximation of the bias by a function of derivatives of estimating function contributions. Reduced-bias M-estimation operates either implicitly, solving empirically adjusted estimating equations, or explicitly, subtracting the estimated bias from the original M-estimates, and applies to partially or fully specified models with likelihoods or surrogate objectives. Automatic differentiation can abstract away the algebra required to implement reduced-bias M-estimation. As a result, the bias-reduction methods we introduce have broader applicability, are straightforward to implement, and demand less algebraic or computational effort than other established bias-reduction methods, which require resampling or expectations of products of log-likelihood derivatives. If M-estimation is by maximising an objective, then there always exists a bias-reducing penalised objective. That penalised objective relates to information criteria for model selection and can be enhanced with plug-in penalties to deliver reduced-bias M-estimates with extra properties, like finiteness for categorical data models. Inferential and model selection procedures for M-estimators apply unaltered to the reduced-bias M-estimates. We demonstrate and assess the properties of reduced-bias M-estimation in well-used, prominent modelling settings of varying complexity.
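Note: The abstract states that the empirical bias approximation is built from derivatives of the estimating function contributions and that automatic differentiation can supply these quantities. As a rough, hypothetical illustration of that workflow only, the sketch below uses JAX to obtain per-observation score contributions and their derivatives for a logistic regression and solves the (unadjusted) estimating equations with a Newton-type iteration, marking where an empirical adjustment A(θ) would enter. The model choice, variable names, and solver are assumptions for illustration; the paper's actual adjustment formula is not reproduced here.

```python
import jax
import jax.numpy as jnp

def loglik_i(theta, x_i, y_i):
    # Log-likelihood contribution of one observation (logistic regression).
    eta = jnp.dot(x_i, theta)
    return y_i * eta - jax.nn.softplus(eta)

# Estimating-function contribution psi_i(theta) = per-observation score,
# plus its first and second derivatives, all obtained by autodiff.
psi_i = jax.grad(loglik_i)      # shape (p,)
dpsi_i = jax.jacfwd(psi_i)      # shape (p, p)
d2psi_i = jax.jacfwd(dpsi_i)    # shape (p, p, p)

def contributions(theta, X, y):
    # Stack the contributions and their derivatives over observations.
    psi = jax.vmap(psi_i, in_axes=(None, 0, 0))(theta, X, y)
    dpsi = jax.vmap(dpsi_i, in_axes=(None, 0, 0))(theta, X, y)
    d2psi = jax.vmap(d2psi_i, in_axes=(None, 0, 0))(theta, X, y)
    return psi, dpsi, d2psi

def m_estimate(theta0, X, y, adjustment=None, n_iter=25):
    # Newton-type solver for sum_i psi_i(theta) + A(theta) = 0.
    # adjustment=None gives the ordinary M-estimate (here, maximum likelihood);
    # plugging in an empirical bias-reducing adjustment A(theta), built from
    # the stacked derivatives above, would give an implicit reduced-bias estimate.
    theta = theta0
    for _ in range(n_iter):
        psi, dpsi, _ = contributions(theta, X, y)
        total = psi.sum(axis=0)
        if adjustment is not None:
            total = total + adjustment(theta, X, y)
        theta = theta - jnp.linalg.solve(dpsi.sum(axis=0), total)
    return theta

# Toy data and the unadjusted M-estimate.
key_x, key_y = jax.random.split(jax.random.PRNGKey(0))
X = jax.random.normal(key_x, (50, 2))
probs = jax.nn.sigmoid(X @ jnp.array([1.0, -0.5]))
y = (jax.random.uniform(key_y, (50,)) < probs).astype(jnp.float32)
print(m_estimate(jnp.zeros(2), X, y))
```

The explicit variant described in the abstract would instead evaluate an estimated bias at the unadjusted M-estimate returned above and subtract it.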
Source URL: