Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions

Type:
Article
Authors:
Apidopoulos, Vassilis; Aujol, Jean-Francois; Dossal, Charles; Rondepierre, Aude
Affiliations:
University of Genoa; Universite de Bordeaux; Centre National de la Recherche Scientifique (CNRS); CNRS - National Institute for Mathematical Sciences (INSMI); Universite de Toulouse; Universite Toulouse III - Paul Sabatier; Universite Federale Toulouse Midi-Pyrenees (ComUE); Institut National des Sciences Appliquees de Toulouse
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-020-01476-3
Publication date:
2021
Pages:
151-193
Keywords:
error-bounds
Abstract:
In this paper we study the convergence properties of the Nesterov family of inertial schemes, a specific case of the inertial gradient descent algorithm, for smooth convex minimization problems, under additional hypotheses on the local geometry of the objective function F, such as the growth (or Łojasiewicz) condition. In particular, we derive convergence rates for the objective function values and for the local variation, depending on these geometric conditions. In this setting we give optimal convergence rates for the Nesterov scheme. Our analysis shows that there are situations in which the Nesterov family of inertial schemes is asymptotically less efficient than plain gradient descent (e.g. when the objective function is quadratic).
Source URL:
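The abstract contrasts plain gradient descent with the Nesterov-type inertial scheme x_{k+1} = y_k - s ∇F(y_k), y_k = x_k + k/(k+α)(x_k - x_{k-1}) on a quadratic objective. As a hedged illustration (not the paper's own experiments; the matrix, step count, and the choice α = 3 are assumptions for the sketch), the two iterations can be compared numerically on F(x) = ½ xᵀAx - bᵀx:

```python
import numpy as np

def grad_descent_vs_nesterov(A, b, x0, steps=200, alpha=3.0):
    """Compare plain gradient descent with an inertial (Nesterov-type)
    scheme on the quadratic F(x) = 0.5 * x^T A x - b^T x.
    Returns the final objective gaps F(x_k) - F(x*) for both methods."""
    L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of grad F
    s = 1.0 / L                              # step size
    grad = lambda x: A @ x - b
    F = lambda x: 0.5 * x @ (A @ x) - b @ x
    x_star = np.linalg.solve(A, b)           # unique minimizer (A SPD)
    F_star = F(x_star)

    # Plain gradient descent: x_{k+1} = x_k - s * grad F(x_k)
    x = x0.copy()
    for _ in range(steps):
        x = x - s * grad(x)
    gd_gap = F(x) - F_star

    # Inertial scheme with extrapolation weight k / (k + alpha):
    #   y_k = x_k + k/(k + alpha) * (x_k - x_{k-1})
    #   x_{k+1} = y_k - s * grad F(y_k)
    x_prev, x = x0.copy(), x0.copy()
    for k in range(1, steps + 1):
        y = x + k / (k + alpha) * (x - x_prev)
        x_prev, x = x, y - s * grad(y)
    nesterov_gap = F(x) - F_star

    return gd_gap, nesterov_gap
```

On a strongly convex quadratic, plain gradient descent converges linearly, while the worst-case guarantee for the inertial scheme is O(1/k²); the paper's point is that in such cases the inertial scheme can be asymptotically the slower of the two.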