Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
Publication type:
Article
Authors:
Drusvyatskiy, Dmitriy; Lewis, Adrian S.
Affiliations:
University of Washington; University of Washington Seattle; Cornell University
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2017.0889
Publication date:
2018
Pages:
919-948
Keywords:
2nd-order variational analysis
tilt stability
metric regularity
algorithm
optimality
calculus
Abstract:
The proximal gradient algorithm for minimizing the sum of a smooth and a nonsmooth convex function often converges linearly even without strong convexity. One common reason is that a multiple of the step length at each iteration may linearly bound the error, that is, the distance to the solution set. We explain the observed linear convergence intuitively by proving the equivalence of such an error bound to a natural quadratic growth condition. Our approach generalizes to linear and quadratic convergence analysis for proximal methods (of Gauss-Newton type) for minimizing compositions of nonsmooth functions with smooth mappings. We observe incidentally that short step lengths in the algorithm indicate near-stationarity, suggesting a reliable termination criterion.
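For concreteness, the two conditions the abstract relates can be sketched as follows; the notation, the constants gamma and alpha, and the neighborhood qualification are illustrative assumptions made here, not taken verbatim from the paper. For F = f + g with f smooth and g closed convex, a step length t > 0, and solution set X*, the prox-gradient step and the two conditions may be written (minimal LaTeX sketch, assuming amsmath):

% Prox-gradient step and the associated step-length mapping G_t;
% all symbols (F, f, g, t, G_t, X^*, \gamma, \alpha) are illustrative notation.
\[
  x^{+} = \operatorname{prox}_{tg}\bigl(x - t\nabla f(x)\bigr),
  \qquad
  G_t(x) = \tfrac{1}{t}\bigl(x - x^{+}\bigr).
\]
% Error bound: a multiple of the step length bounds the distance to the solution set.
\[
  \operatorname{dist}(x, X^{*}) \;\le\; \gamma\,\bigl\|G_t(x)\bigr\|
  \quad\text{for all } x \text{ near } X^{*}.
\]
% Quadratic growth: the objective grows quadratically with the distance to the solution set.
\[
  F(x) \;\ge\; \min F + \tfrac{\alpha}{2}\,\operatorname{dist}^{2}(x, X^{*})
  \quad\text{for all } x \text{ near } X^{*}.
\]

Under such an error bound, a small step \|x^{+} - x\| certifies that the iterate is close to the solution set, which is the termination criterion the abstract alludes to.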