ISOTONIC REGRESSION IN GENERAL DIMENSIONS

Publication type:
Article
Authors:
Han, Qiyang; Wang, Tengyao; Chatterjee, Sabyasachi; Samworth, Richard J.
Affiliations:
University of Washington; University of Washington Seattle; University of Cambridge; University of Chicago; University of Illinois System; University of Illinois Urbana-Champaign
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/18-AOS1753
Publication date:
2019
Pages:
2440-2471
Keywords:
least squares; risk bounds; concentration inequalities; rates of convergence; adaptation; algorithm; entropy; models
Abstract:
We study the least squares regression function estimator over the class of real-valued functions on [0,1]^d that are increasing in each coordinate. For uniformly bounded signals and with a fixed, cubic lattice design, we establish that the estimator achieves the minimax rate of order n^{-min{2/(d+2), 1/d}} in the empirical L_2 loss, up to polylogarithmic factors. Further, we prove a sharp oracle inequality, which reveals in particular that when the true regression function is piecewise constant on k hyperrectangles, the least squares estimator enjoys a faster, adaptive rate of convergence of (k/n)^{min(1, 2/d)}, again up to polylogarithmic factors. Previous results are confined to the case d <= 2. Finally, we establish corresponding bounds (which are new even in the case d = 2) in the more challenging random design setting. There are two surprising features of these results: first, they demonstrate that it is possible for a global empirical risk minimisation procedure to be rate optimal up to polylogarithmic factors even when the corresponding entropy integral for the function class diverges rapidly; second, they indicate that the adaptation rate for shape-constrained estimators can be strictly worse than the parametric rate.
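For intuition about the estimator discussed in the abstract, the following is a minimal sketch of the d = 1 special case only: there, the isotonic least squares fit over monotone sequences can be computed exactly by the classical Pool Adjacent Violators Algorithm (PAVA). This is an illustrative aside, not the paper's multidimensional lattice setting, and the function name `pava` is our own.

```python
# Pool Adjacent Violators Algorithm (PAVA): isotonic least squares in d = 1.
# Illustrative sketch only; the paper above treats the general d-dimensional
# cubic lattice design, for which no such simple closed-form algorithm applies.

def pava(y):
    """Return the non-decreasing least squares fit to the sequence y."""
    # Each block stores [mean, weight] for a pooled run of observations.
    blocks = []
    for value in y:
        blocks.append([float(value), 1])
        # Pool adjacent blocks while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            w = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / w, w])
    # Expand each pooled block back to one fitted value per observation.
    fit = []
    for mean, weight in blocks:
        fit.extend([mean] * weight)
    return fit

print(pava([3.0, 1.0, 2.0]))  # the violating run is pooled to its mean: [2.0, 2.0, 2.0]
```

Pooling a violating pair to its weighted mean is exactly the projection step of least squares under the ordering constraint, which is why the output is the constrained minimiser.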