Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
Publication Type:
Article
Authors:
Davis, Damek; Drusvyatskiy, Dmitriy
Affiliations:
Cornell University; University of Washington, Seattle
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2021.1126
Publication Year:
2022
Pages:
209-231
Keywords:
stationary points
stability
approximation
asymptotics
nonsmooth
composite
set
Abstract:
We investigate the stochastic optimization problem of minimizing population risk, where the loss defining the risk is assumed to be weakly convex. Compositions of Lipschitz convex functions with smooth maps are the primary examples of such losses. We analyze the estimation quality of such nonsmooth and nonconvex problems by their sample average approximations. Our main results establish dimension-dependent rates on subgradient estimation in full generality and dimension-independent rates when the loss is a generalized linear model. As an application of the developed techniques, we analyze the nonsmooth landscape of a robust nonlinear regression problem.
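As background for the abstract (not part of the bibliographic record), the weak convexity condition it references, and the composite losses given as the primary examples, can be sketched as follows; the constants $\rho$, $L$, and $\beta$ below are standard notation, not taken from the record itself:

```latex
% A function $f$ is $\rho$-weakly convex if $x \mapsto f(x) + \tfrac{\rho}{2}\|x\|^2$
% is convex; equivalently, for every subgradient $v \in \partial f(x)$,
\[
  f(y) \;\geq\; f(x) + \langle v,\, y - x \rangle - \frac{\rho}{2}\|y - x\|^2
  \qquad \text{for all } x, y.
\]
% For the composite losses named in the abstract, $f = h \circ c$ with $h$ convex
% and $L$-Lipschitz and $c$ smooth with $\beta$-Lipschitz Jacobian, this holds
% with $\rho = L\beta$.
```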
Source URL: