Perturbed Fenchel duality and first-order methods

Publication type:
Article
Authors:
Gutman, David H.; Pena, Javier F.
Affiliations:
Texas Tech University System; Texas Tech University; Carnegie Mellon University
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-022-01779-7
Publication date:
2023
Pages:
443-469
Keywords:
gradient methods; convergence; optimization; minimization; subgradient; algorithm; descent
Abstract:
We show that the iterates generated by a generic first-order meta-algorithm satisfy a canonical perturbed Fenchel duality inequality. The latter in turn readily yields a unified derivation of the best known convergence rates for various popular first-order algorithms including the conditional gradient method as well as the main kinds of Bregman proximal methods: subgradient, gradient, fast gradient, and universal gradient methods.
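As an illustration of one of the algorithms covered by the abstract, the sketch below implements the classical conditional gradient (Frank-Wolfe) method with the standard 2/(k+2) step size, applied to a small quadratic over the probability simplex. This is a minimal, self-contained example of the well-known method, not the paper's meta-algorithm or its perturbed Fenchel duality analysis; the function names and the test problem are illustrative choices.

```python
import numpy as np

def conditional_gradient(grad, linear_oracle, x0, num_iters=100):
    """Classical conditional gradient (Frank-Wolfe) method.

    grad:          callable returning the gradient of the objective at x
    linear_oracle: callable returning argmin_{s in C} <g, s> for a gradient g
    x0:            feasible starting point in the set C
    """
    x = x0
    for k in range(num_iters):
        g = grad(x)
        s = linear_oracle(g)             # linear minimization over the feasible set
        gamma = 2.0 / (k + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Hypothetical example: minimize f(x) = 0.5 * ||x - b||^2 over the probability simplex.
b = np.array([0.2, 0.9, -0.3])
grad = lambda x: x - b

def simplex_lmo(g):
    # Over the simplex, argmin <g, s> is attained at the vertex with the smallest g entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x_star = conditional_gradient(grad, simplex_lmo, x0=np.ones(3) / 3, num_iters=200)
print(x_star)
```

The O(1/k) convergence rate of this scheme is one of the rates that the paper's perturbed Fenchel duality inequality recovers in a unified way.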