New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
Publication type:
Article
Authors:
Freund, Robert M.; Lu, Haihao
Affiliations:
Massachusetts Institute of Technology (MIT); Massachusetts Institute of Technology (MIT)
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-017-1164-1
Publication year:
2018
Pages:
445-477
Keywords:
weak sharp minima
Abstract:
Motivated by recent work of Renegar (A framework for applying subgradient methods to conic optimization problems, 2015), we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* := \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.), as well as in Renegar's transformed version of the standard conic optimization problem (2015); in all these cases one has $f_{\mathrm{slb}} = 0 < f^*$.] We introduce a new functional measure for $f(\cdot)$ called the growth constant $G$, which measures how quickly the level sets of $f(\cdot)$ grow relative to the function value, and which plays a fundamental role in the complexity analysis. When $f(\cdot)$ is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate $x^0$ is far from the optimal solution set. When $f(\cdot)$ is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when $x^0$ is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees.
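To make the role of the strict lower bound concrete, below is a minimal illustrative sketch in Python of a Polyak-type subgradient step driven by the gap $f(x) - f_{\mathrm{slb}}$. This is an assumption-laden stand-in, not the paper's exact method: the function names, the least-squares test problem with $f_{\mathrm{slb}} = 0$, and all parameter choices are hypothetical, and the paper's guarantees rest on a step-size analysis involving the growth constant $G$ that is not reproduced here.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, f_slb, iters=500):
    """Polyak-type subgradient method using a strict lower bound.

    Illustrative sketch only. Because f_slb is strictly below the
    optimal value f*, the step gap/||g||^2 overshoots the classical
    Polyak step, and this plain loop only drives f(x) down to within
    a multiple of f* - f_slb; Freund and Lu's analysis handles this
    gap rigorously via the growth constant G.
    """
    x = x0.copy()
    best_x, best_val = x.copy(), f(x)
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_slb                       # > 0 since f_slb < f*
        x = x - (gap / (np.dot(g, g) + 1e-12)) * g
        if f(x) < best_val:
            best_x, best_val = x.copy(), f(x)
    return best_x, best_val

# Assumed example: a least-squares loss, where f_slb = 0 is naturally known.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
subgrad = lambda x: A.T @ (A @ x - b)            # gradient, hence a subgradient
x_hat, val = subgradient_descent(f, subgrad, np.zeros(10), f_slb=0.0)
print(f"best objective value found: {val:.4f}")
```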
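The restarting idea for the smooth case can likewise be illustrated with a generic periodically restarted accelerated gradient loop. Again this is a sketch under stated assumptions: the fixed restart period, the step size $1/L$, and the quadratic test problem are all hypothetical choices for illustration, whereas the paper's scheme decides when to restart using $f_{\mathrm{slb}}$ and the growth constant $G$.

```python
import numpy as np

def restarted_agm(grad, x0, L, restart_period=50, n_restarts=10):
    """Nesterov's accelerated gradient method with periodic restarts.

    Illustrative sketch: momentum is reset every `restart_period`
    iterations; Freund and Lu instead trigger restarts adaptively,
    based on the strict lower bound f_slb and the growth constant G.
    """
    x = x0.copy()
    for _ in range(n_restarts):
        y, x_prev, t = x.copy(), x.copy(), 1.0   # reset momentum state
        for _ in range(restart_period):
            x_next = y - grad(y) / L             # gradient step from y
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_next + ((t - 1) / t_next) * (x_next - x_prev)
            x_prev, t = x_next, t_next
        x = x_prev                               # restart from latest iterate
    return x

# Assumed example: a strongly convex quadratic test problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))
Q = A.T @ A + 0.1 * np.eye(20)
b = rng.standard_normal(20)
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
L = np.linalg.eigvalsh(Q).max()                  # smoothness constant of f
x_star = np.linalg.solve(Q, b)
x_hat = restarted_agm(grad, np.zeros(20), L)
print(f"optimality gap: {f(x_hat) - f(x_star):.2e}")
```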