Faster first-order primal-dual methods for linear programming using restarts and sharpness

Publication type:
Article
Authors:
Applegate, David; Hinder, Oliver; Lu, Haihao; Lubin, Miles
Affiliations:
Alphabet Inc.; Google Incorporated; Pennsylvania Commonwealth System of Higher Education (PCSHE); University of Pittsburgh; University of Chicago
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-022-01901-9
Publication year:
2023
Pages:
133-184
Keywords:
gradient methods; variational inequality; subgradient methods; monotone operators; complexity theory; convergence; optimization; algorithms; systems
Abstract:
First-order primal-dual methods are appealing for their low memory overhead, fast iterations, and effective parallelization. However, they are often slow at finding high-accuracy solutions, which creates a barrier to their use in traditional linear programming (LP) applications. This paper exploits the sharpness of primal-dual formulations of LP instances to achieve linear convergence using restarts, in a general setting that applies to the alternating direction method of multipliers (ADMM), the primal-dual hybrid gradient method (PDHG), and the extragradient method (EGM). In the special case of PDHG, without restarts we show an iteration count lower bound of Omega(kappa^2 log(1/epsilon)), while with restarts we show an iteration count upper bound of O(kappa log(1/epsilon)), where kappa is a condition number and epsilon is the desired accuracy. Moreover, the upper bound is optimal for a wide class of primal-dual methods, and applies to the strictly more general class of sharp primal-dual problems. We develop an adaptive restart scheme and verify that restarts significantly improve the ability of PDHG, EGM, and ADMM to find high-accuracy solutions to LP problems.
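To make the restart idea concrete, the following is a minimal sketch of PDHG with fixed-frequency restarts for the LP saddle point min_{x >= 0} max_y c^T x + y^T (b - Ax). This is a hypothetical illustration only: the paper's adaptive scheme chooses restart times from progress measures, whereas here each restart simply re-centers the iterates at the running average of the inner loop; the function name, step sizes, and restart frequency are assumptions, not the authors' code.

```python
import numpy as np

def restarted_pdhg(c, A, b, n_restarts=20, inner_iters=200):
    """Fixed-frequency restarted PDHG sketch for: min c^T x s.t. Ax = b, x >= 0.

    Illustrative only; the paper's method uses an adaptive restart rule.
    """
    m, n = A.shape
    # Step sizes chosen so that tau * sigma * ||A||_2^2 < 1 (standard PDHG condition).
    op_norm = np.linalg.norm(A, 2)
    tau = sigma = 0.9 / op_norm
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(n_restarts):
        x_avg, y_avg = np.zeros(n), np.zeros(m)
        for _ in range(inner_iters):
            # Primal step: projected gradient step on the Lagrangian, then extrapolation.
            x_new = np.maximum(0.0, x - tau * (c - A.T @ y))
            y = y + sigma * (b - A @ (2 * x_new - x))
            x = x_new
            x_avg += x / inner_iters
            y_avg += y / inner_iters
        # Restart: re-center at the averaged iterate of the inner loop.
        x, y = x_avg, y_avg
    return x, y

# Toy LP: min x1 + 2*x2  s.t.  x1 + x2 = 1,  x >= 0; optimum is x = (1, 0).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, y = restarted_pdhg(c, A, b)
```

On sharp problems such as this LP, restarting at the average is what upgrades PDHG's sublinear ergodic rate to linear convergence, which is the mechanism behind the O(kappa log(1/epsilon)) bound stated above.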
Source URL: