Rates of superlinear convergence for classical quasi-Newton methods

Publication type:
Article
Authors:
Rodomanov, Anton; Nesterov, Yurii
Affiliations:
Universite Catholique Louvain; Universite Catholique Louvain
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-021-01622-5
Publication date:
2022
Pages:
159-190
Keywords:
Abstract:
We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was established long ago that these methods converge superlinearly in the asymptotic sense, the corresponding rates of convergence have remained unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods based on updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form $\left(nL^2/(\mu^2 k)\right)^{k/2}$ and $\left(nL/(\mu k)\right)^{k/2}$, respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
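A minimal sketch of the BFGS method mentioned in the abstract (illustrative only, not code from the paper; the quadratic test problem, the exact line search, and the function name bfgs_quadratic are assumptions made for this example). It applies the standard inverse-Hessian BFGS update to a strongly convex quadratic, where the superlinear decrease of the gradient norm can be observed numerically.

import numpy as np

def bfgs_quadratic(A, b, x0, iters=20):
    # Minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
    n = len(b)
    H = np.eye(n)                            # inverse-Hessian approximation
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-12:        # stop once the gradient vanishes
            break
        d = -H @ g                           # quasi-Newton direction
        alpha = (g @ H @ g) / (d @ A @ d)    # exact line search for a quadratic
        x_new = x + alpha * d
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy <= 1e-14:                      # safeguard: curvature must be positive
            break
        rho = 1.0 / sy
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS update of H = B^{-1}
        x, g = x_new, g_new
        print(np.linalg.norm(g))             # gradient norm shrinks superlinearly
    return x

rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 5))
A = Q.T @ Q + np.eye(5)                      # eigenvalues >= 1, so strongly convex
b = rng.standard_normal(5)
bfgs_quadratic(A, b, np.zeros(5))

The DFP method differs only in the updating formula for the Hessian approximation; both belong to the convex Broyden class studied in the paper.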