ON LEAST SQUARES ESTIMATION UNDER HETEROSCEDASTIC AND HEAVY-TAILED ERRORS
Publication type:
Article
Authors:
Kuchibhotla, Arun K.; Patra, Rohit K.
Affiliations:
Carnegie Mellon University; State University System of Florida; University of Florida
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/21-AOS2105
Publication date:
2022
Pages:
277-302
Keywords:
empirical risk minimization
central limit theorems
oracle inequalities
bounds
rates
regression
convergence
entropy
Abstract:
We consider least squares estimation in a general nonparametric regression model where the error is allowed to depend on the covariates. The rate of convergence of the least squares estimator (LSE) for the unknown regression function is well studied when the errors are sub-Gaussian. We find upper bounds on the rate of convergence of the LSE when the error has a uniformly bounded conditional variance and only finitely many moments. Our upper bound on the rate of convergence of the LSE depends on the moment assumptions on the error, the metric entropy of the class of functions involved, and the local structure of the function class around the truth. We find sufficient conditions on the error distribution under which the rate of the LSE matches that of the LSE under sub-Gaussian errors. Our results are finite-sample and allow for heteroscedastic and heavy-tailed errors.
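To make the setting in the abstract concrete, the following is a standard formulation of the least squares estimator over a function class; the symbols $\mathcal{F}$, $f_0$, $(X_i, Y_i)$, and $\xi_i$ are generic notation chosen here for illustration, not taken verbatim from the paper:
$$
Y_i = f_0(X_i) + \xi_i, \qquad \mathbb{E}[\xi_i \mid X_i] = 0, \qquad
\widehat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}} \; \frac{1}{n} \sum_{i=1}^{n} \bigl(Y_i - f(X_i)\bigr)^2 .
$$
In this notation, heteroscedasticity means that $\mathrm{Var}(\xi_i \mid X_i)$ may vary with $X_i$ (while remaining uniformly bounded), and heavy tails mean that only finitely many moments of $\xi_i$ are assumed, rather than sub-Gaussian tail behavior.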