Optimized first-order methods for smooth convex minimization

Publication type:
Article
Authors:
Kim, Donghwan; Fessler, Jeffrey A.
Affiliations:
University of Michigan System; University of Michigan
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-015-0949-3
Publication date:
2016
Pages:
81-107
Keywords:
Abstract:
We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1-2):451-482, 2014. doi:10.1007/s10107-013-0653-0) recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods (Polyak in USSR Comput Math Math Phys 4(5):1-17, 1964. doi:10.1016/0041-5553(64)90137-5), and Nesterov's fast gradient methods (Nesterov in Sov Math Dokl 27(2):372-376, 1983; Math Program 103(1):127-152, 2005. doi:10.1007/s10107-004-0552-5). However, the numerical method in Drori and Teboulle (2014) is computationally expensive for large N, and the corresponding numerically optimized first-order algorithm requires impractical memory and computation for large-scale optimization problems. In this paper, we propose optimized first-order algorithms that achieve a convergence bound that is a factor of two smaller than that of Nesterov's fast gradient methods; our bound is found analytically and refines the numerical bound in Drori and Teboulle (2014). Furthermore, the proposed optimized first-order methods have efficient forms that are remarkably similar to Nesterov's fast gradient methods.
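
To make the "efficient form" mentioned in the abstract concrete, below is a minimal Python sketch of an OGM-style iteration as this class of methods is commonly stated: a Nesterov-like two-sequence recursion with modified momentum weights and a distinct final step factor. The names "ogm" and "grad_f", the assumption that the gradient Lipschitz constant L is known, and the exact coefficient choices are illustrative assumptions here, not quoted from the paper.

import numpy as np

def ogm(grad_f, L, x0, N):
    # Illustrative OGM-style accelerated gradient sketch (assumed form).
    # grad_f: gradient of the smooth convex objective f
    # L: Lipschitz constant of grad_f (assumed known)
    # x0: starting point; N: fixed, pre-specified number of iterations
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for i in range(N):
        y_next = x - grad_f(x) / L  # standard gradient step
        if i < N - 1:
            theta_next = (1.0 + np.sqrt(1.0 + 4.0 * theta**2)) / 2.0
        else:
            # larger final step factor, specific to the fixed-N setting
            theta_next = (1.0 + np.sqrt(1.0 + 8.0 * theta**2)) / 2.0
        # FGM-like momentum plus an extra correction term weighted by
        # theta / theta_next; this additional term is what distinguishes
        # the optimized method from Nesterov's fast gradient method
        x = (y_next
             + ((theta - 1.0) / theta_next) * (y_next - y)
             + (theta / theta_next) * (y_next - x))
        y, theta = y_next, theta_next
    return x

Note that the per-iteration cost and memory footprint of this recursion match Nesterov's fast gradient method (one gradient evaluation and a few vector operations per iteration), which is the practical point the abstract emphasizes in contrast to the numerically optimized algorithm of Drori and Teboulle (2014).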