Faster subgradient methods for functions with Hölderian growth
Publication type:
Article
Authors:
Johnstone, Patrick R.; Moulin, Pierre
Affiliations:
Rutgers University System; Rutgers University New Brunswick; University of Illinois System; University of Illinois Urbana-Champaign
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-018-01361-0
Publication year:
2020
Pages:
417-450
Keywords:
weak sharp minima
stochastic-approximation
convergence rate
descent methods
error-bounds
algorithms
regression
vector
Abstract:
The purpose of this manuscript is to derive new convergence results for several subgradient methods applied to minimizing nonsmooth convex functions with Hölderian growth. The growth condition is satisfied in many applications and includes functions with quadratic growth and weakly sharp minima as special cases. To this end, there are three main contributions. First, for a constant and sufficiently small stepsize, we show that the subgradient method achieves linear convergence up to a certain region including the optimal set, with error on the order of the stepsize. Second, if appropriate problem parameters are known, we derive a decaying stepsize that obtains a much faster convergence rate than the classical O(1/k) result for the subgradient method suggests. Third, we develop a novel descending stairs stepsize that obtains this faster convergence rate and also achieves linear convergence in the special case of weakly sharp functions. We also develop an adaptive variant of the descending stairs stepsize that achieves the same convergence rate without requiring an error-bound constant that is difficult to estimate in practice.
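To illustrate the stepsize regimes the abstract describes, the following is a minimal, hypothetical sketch (not the paper's actual parameter choices, which depend on the Hölder growth constants): a plain subgradient iteration applied to f(x) = |x|, a function with weakly sharp minima, first with a constant stepsize and then with a simple "descending stairs" schedule that holds the stepsize fixed for a block of iterations before halving it.

```python
def subgradient_method(subgrad, x0, stepsizes):
    """Subgradient iteration x_{k+1} = x_k - t_k * g_k for a given
    stepsize sequence. Illustrative sketch only."""
    x = float(x0)
    for t in stepsizes:
        x -= t * subgrad(x)
    return x

# f(x) = |x| has weakly sharp minima (Holderian growth, exponent 1).
# (x > 0) - (x < 0) is a valid subgradient of |x| (0 at x = 0).
subgrad = lambda x: (x > 0) - (x < 0)

# 1) Constant stepsize: fast progress until the iterates reach a region
#    around the minimizer whose width is on the order of the stepsize,
#    where they then oscillate.
x_const = subgradient_method(subgrad, 5.0, [0.1] * 200)

# 2) "Descending stairs" (illustrative schedule): hold the stepsize for
#    60 iterations, then halve it, shrinking the oscillation region and
#    yielding a much smaller final error.
stairs = [0.1 * 2.0 ** (-j) for j in range(6) for _ in range(60)]
x_stairs = subgradient_method(subgrad, 5.0, stairs)

print(abs(x_const), abs(x_stairs))
```

With the constant stepsize 0.1, the final error is bounded by roughly the stepsize itself, while the stairs schedule drives the error down to roughly the last-stage stepsize, 0.1 / 2^5.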