Optimal complexity and certification of Bregman first-order methods

Result Type:
Article
Authors:
Dragomir, Radu-Alexandru; Taylor, Adrien B.; D'Aspremont, Alexandre; Bolte, Jerome
Affiliations:
Universite de Toulouse; Universite Toulouse 1 Capitole; Toulouse School of Economics; Universite PSL; Ecole Normale Superieure (ENS); Inria; Centre National de la Recherche Scientifique (CNRS); CNRS - Institute for Information Sciences & Technologies (INS2I)
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-021-01618-1
Publication Date:
2022
Pages:
41-83
Keywords:
Lipschitz gradient continuity; mirror descent; convex optimization; minimization; subgradient; convergence; performance; algorithm
Abstract:
We provide a lower bound showing that the O(1/k) convergence rate of the NoLips method (a.k.a. Bregman Gradient or Mirror Descent) is optimal for the class of problems satisfying the relative smoothness assumption. This assumption appeared in recent developments around the Bregman Gradient method, where acceleration remained an open issue. The main inspiration behind this lower bound stems from an extension of the performance estimation framework of Drori and Teboulle (Mathematical Programming, 2014) to Bregman first-order methods. This technique allows us to compute worst-case scenarios for NoLips in the context of relatively smooth minimization. In particular, we use numerically generated worst-case examples as a basis for obtaining the general lower bound.
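For context, below is a minimal sketch of the NoLips / Bregman Gradient iteration discussed in the abstract, assuming the standard setup from the relative-smoothness literature: f is L-smooth relative to a reference function h (meaning L*h - f is convex), and each step minimizes a linearization of f plus a Bregman divergence D_h. The helper names, the Burg-entropy kernel, and the toy objective are illustrative assumptions for this sketch, not code or an instance from the paper.

```python
import numpy as np

# One NoLips / Bregman Gradient step: with the Bregman divergence
#   D_h(x, y) = h(x) - h(y) - <grad_h(y), x - y>,
# the iterate solves
#   x_{k+1} = argmin_x  <grad_f(x_k), x> + (1/lam) * D_h(x, x_k),
# which, in the unconstrained case, is the mirror update
#   grad_h(x_{k+1}) = grad_h(x_k) - lam * grad_f(x_k).
def nolips_step(x, grad_f, grad_h, grad_h_inv, lam):
    # The step size should satisfy lam <= 1/L when f is L-smooth
    # relative to h; this is the setting of the O(1/k) guarantee.
    return grad_h_inv(grad_h(x) - lam * grad_f(x))

# Illustrative kernel (an assumption, not from the paper): the Burg
# entropy h(x) = -sum(log x) on the positive orthant, for which
# grad_h(x) = -1/x and the inverse mirror map is y -> -1/y.
grad_h = lambda x: -1.0 / x
grad_h_inv = lambda y: -1.0 / y

# Toy objective f(x) = sum(x - log x), so grad_f(x) = 1 - 1/x.
# Here 1*h - f = -sum(x) is linear (hence convex), so f is
# 1-smooth relative to h and any lam <= 1 is admissible.
grad_f = lambda x: 1.0 - 1.0 / x

x = np.array([3.0, 0.2])
for _ in range(30):
    x = nolips_step(x, grad_f, grad_h, grad_h_inv, lam=0.5)
print(x)  # approaches the minimizer x* = (1, 1)
```

The paper's contribution, per the abstract, is a matching lower bound: under relative smoothness alone, no Bregman first-order method of this type can improve on the O(1/k) rate that iterations like the one above achieve.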