CONVEX REGRESSION IN MULTIDIMENSIONS: SUBOPTIMALITY OF LEAST SQUARES ESTIMATORS
Result type:
Article
Authors:
Kur, Gil; Gao, Fuchang; Guntuboyina, Adityanand; Sen, Bodhisattva
Affiliations:
Swiss Federal Institutes of Technology Domain; ETH Zurich; University of Idaho; University of California System; University of California Berkeley; Columbia University
Journal:
ANNALS OF STATISTICS
ISSN/ISBN:
0090-5364
DOI:
10.1214/24-AOS2445
Publication date:
2024
Pages:
2791-2815
Keywords:
nonparametric approach
convergence-rates
risk bounds
consistency
entropy
sets
Abstract:
Under the usual nonparametric regression model with Gaussian errors, least squares estimators (LSEs) are shown to be suboptimal for estimating a d-dimensional convex function in squared error loss when the dimension d is 5 or larger. The specific function classes considered include: (i) bounded convex functions supported on a polytope (in random design), (ii) Lipschitz convex functions supported on any convex domain (in random design) and (iii) convex functions supported on a polytope (in fixed design). For each of these classes, the risk of the LSE is proved to be of the order n^{-2/d} (up to logarithmic factors) while the minimax risk is n^{-4/(d+4)}, when d >= 5. In addition, the first rate-of-convergence results (worst case and adaptive) for the unrestricted convex LSE are established in fixed design for polytopal domains for all d >= 1. Some new metric entropy results for convex functions are also proved, which are of independent interest.
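The suboptimality claim in the abstract reduces to comparing the two rate exponents: n^{-2/d} decays slower than n^{-4/(d+4)} exactly when 2/d < 4/(d+4), i.e., when d > 4. A quick numerical check of this comparison (an illustrative sketch, not part of the paper):

```python
# Compare the LSE rate exponent 2/d against the minimax rate exponent
# 4/(d+4) from the abstract. A smaller exponent means a slower decaying
# risk, i.e., a suboptimal estimator.
for d in range(1, 11):
    lse_exp = 2 / d
    minimax_exp = 4 / (d + 4)
    verdict = "LSE suboptimal" if lse_exp < minimax_exp else "LSE rate-optimal"
    print(f"d={d:2d}: LSE exponent {lse_exp:.3f}, "
          f"minimax exponent {minimax_exp:.3f} -> {verdict}")
```

The exponents agree at d = 4 (both equal 1/2), and for every d >= 5 the LSE exponent 2/d falls strictly below 4/(d+4), matching the threshold stated in the abstract.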