A simple and general debiased machine learning theorem with finite-sample guarantees
Type:
Article
Authors:
Chernozhukov, V.; Newey, W. K.; Singh, R.
Affiliation:
Massachusetts Institute of Technology (MIT)
Journal:
BIOMETRIKA
ISSN/ISBN:
0006-3444
DOI:
10.1093/biomet/asac033
Publication date:
2023
Pages:
257-264
Keywords:
confidence-intervals
efficient estimation
regression-models
inference
parameters
Abstract:
Debiased machine learning is a meta-algorithm based on bias correction and sample splitting to calculate confidence intervals for functionals, i.e., scalar summaries, of machine learning algorithms. For example, an analyst may seek the confidence interval for a treatment effect estimated with a neural network. We present a non-asymptotic debiased machine learning theorem that encompasses any global or local functional of any machine learning algorithm that satisfies a few simple, interpretable conditions. Formally, we prove consistency, Gaussian approximation and semiparametric efficiency by finite-sample arguments. The rate of convergence is n^{-1/2} for global functionals, and it degrades gracefully for local functionals. Our results culminate in a simple set of conditions that an analyst can use to translate modern learning theory rates into traditional statistical inference. The conditions reveal a general double robustness property for ill-posed inverse problems.
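To make the abstract's recipe concrete, the following is a minimal sketch of the debiased machine learning meta-algorithm for one global functional, the average treatment effect: nuisance functions are fit on held-out folds (sample splitting / cross-fitting) and combined through a Neyman-orthogonal, bias-corrected moment (the AIPW score), whose fold-averaged value and plug-in standard error yield a confidence interval. The least-squares nuisance fits below are illustrative stand-ins for any machine learning algorithm; the function name `dml_ate` and all implementation details are hypothetical, not taken from the paper.

```python
import numpy as np

def dml_ate(y, d, x, n_folds=2, seed=0):
    """Cross-fitted doubly robust (AIPW) estimate of the average
    treatment effect with a 95% confidence interval.

    y : outcomes, d : binary treatment, x : scalar covariate.
    """
    n = len(y)
    # Sample splitting: assign each observation to one of n_folds folds.
    folds = np.random.default_rng(seed).permutation(n) % n_folds
    psi = np.empty(n)  # orthogonal score evaluated out of fold
    for k in range(n_folds):
        tr, te = folds != k, folds == k
        # Nuisance 1: outcome regression of y on (1, d, x), fit on the
        # training folds only (stand-in for any ML learner).
        Ztr = np.column_stack([np.ones(tr.sum()), d[tr], x[tr]])
        b = np.linalg.lstsq(Ztr, y[tr], rcond=None)[0]
        mu1 = b[0] + b[1] + b[2] * x[te]  # predicted outcome under d = 1
        mu0 = b[0] + b[2] * x[te]         # predicted outcome under d = 0
        # Nuisance 2: propensity score, here a clipped linear fit of d on x.
        Ptr = np.column_stack([np.ones(tr.sum()), x[tr]])
        g = np.linalg.lstsq(Ptr, d[tr], rcond=None)[0]
        e = np.clip(g[0] + g[1] * x[te], 0.05, 0.95)
        # Bias-corrected (Neyman-orthogonal) moment: plug-in difference
        # plus inverse-propensity-weighted residual corrections.
        psi[te] = (mu1 - mu0
                   + d[te] * (y[te] - mu1) / e
                   - (1 - d[te]) * (y[te] - mu0) / (1 - e))
    est = psi.mean()
    se = psi.std(ddof=1) / np.sqrt(n)  # plug-in standard error
    z = 1.96                           # normal 97.5% quantile
    return est, (est - z * se, est + z * se)
```

The averaged score converges at the n^{-1/2} rate discussed in the abstract, and the score's double robustness means the estimate remains consistent if either the outcome model or the propensity model is misspecified.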