Efficiency of minimizing compositions of convex functions and smooth maps
Publication type:
Article
Authors:
Drusvyatskiy, D.; Paquette, C.
Affiliations:
University of Washington; University of Washington Seattle
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-018-1311-3
Publication date:
2019
Pages:
503-558
Keywords:
least-squares
convergence
algorithm
minimization
regression
Abstract:
We consider the global efficiency of algorithms for minimizing a sum of a convex function and a composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely on is the prox-linear method, which in each iteration solves a regularized subproblem formed by linearizing the smooth map. When the subproblems are solved exactly, the method has efficiency O(ε^{-2}), akin to gradient descent for smooth minimization. We show that when the subproblems can only be solved by first-order methods, a simple combination of smoothing, the prox-linear method, and a fast-gradient scheme yields an algorithm with complexity Õ(ε^{-3}). We round off the paper with an inertial prox-linear method that automatically accelerates in the presence of convexity.
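The abstract's central idea — linearize the smooth map inside the convex outer function and solve the resulting regularized subproblem — can be illustrated in a minimal sketch. The sketch below (not the paper's implementation; all function names are illustrative) takes the simplest composite instance, minimizing |c(x)| for a smooth scalar map c, where each linearized subproblem min_y |c(x) + c'(x)(y - x)| + (1/(2t))(y - x)^2 has a closed-form solution via soft-thresholding:

```python
import math

def soft_threshold(v, tau):
    # Proximal map of tau * |.| evaluated at v.
    return math.copysign(max(abs(v) - tau, 0.0), v)

def prox_linear_abs(c, dc, x0, t=0.5, iters=20):
    """Prox-linear iteration for min_x |c(x)| with smooth scalar c.

    Each step solves the regularized linearized subproblem
        min_y |c(x) + c'(x)(y - x)| + (1/(2t)) (y - x)^2
    exactly.  Substituting u = c(x) + c'(x)(y - x) turns it into a
    1-D prox of |.|, solved by soft-thresholding.
    """
    x = x0
    for _ in range(iters):
        b, a = c(x), dc(x)
        if a == 0.0:          # linearization is flat; stop
            break
        u = soft_threshold(b, t * a * a)  # optimal linearized value
        x = x + (u - b) / a               # recover y from u
    return x

# Illustrative run: minimize |x^2 - 2|, i.e. locate sqrt(2).
root = prox_linear_abs(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=2.0)
```

With the step parameter t = 0.5 (matching 1/(βL) for this instance, since the outer |.| is 1-Lipschitz and c'' = 2) the iterates converge rapidly to sqrt(2); in the general vector case the subproblem has no closed form, which is exactly the regime where the paper's smoothing plus fast-gradient scheme applies.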
Source URL: