Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
Publication type:
Article
Author(s):
Laszlo, Szilard Csaba
Affiliation:
Technical University of Cluj Napoca
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-020-01534-w
Publication year:
2021
Pages:
285-329
Keywords:
forward-backward algorithm
proximal algorithm
descent methods
heavy-ball
optimization
iPiano
Abstract:
We investigate an inertial algorithm of gradient type in connection with the minimization of a non-convex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We prove some abstract convergence results which, when applied to our numerical scheme, show that the generated sequences converge to a critical point of the objective function, provided a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we obtain convergence rates for the generated sequences and the objective function values, formulated in terms of the Łojasiewicz exponent of a regularization of the objective function. Finally, some numerical experiments are presented in order to compare our numerical scheme with several algorithms well known in the literature.
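The abstract does not state the paper's exact update rule, so the following is only a minimal sketch of a generic Nesterov-type inertial gradient iteration for a smooth non-convex function, with an assumed inertial parameter beta_n = n/(n+3), an assumed fixed step size, and a hypothetical test function; it illustrates the kind of scheme discussed, not the author's specific algorithm.

```python
import numpy as np

def inertial_gradient(grad, x0, step=1e-2, n_iter=5000, tol=1e-8):
    """Generic Nesterov-type inertial gradient iteration (illustrative sketch only).

    Assumed update (not the paper's exact scheme):
        y_n     = x_n + beta_n * (x_n - x_{n-1})
        x_{n+1} = y_n - step * grad(y_n)
    with beta_n = n / (n + 3), a common Nesterov-style choice.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for n in range(1, n_iter + 1):
        beta = n / (n + 3.0)                  # assumed inertial parameter
        y = x + beta * (x - x_prev)           # extrapolation (inertial) step
        x_prev, x = x, y - step * grad(y)     # gradient step at the extrapolated point
        if np.linalg.norm(x - x_prev) < tol:  # stop when successive iterates stabilize
            break
    return x

# Usage on a smooth non-convex test function f(x, y) = (x^2 - 1)^2 + y^2.
grad_f = lambda z: np.array([4 * z[0] * (z[0] ** 2 - 1), 2 * z[1]])
print(inertial_gradient(grad_f, x0=[0.3, 1.0]))  # approaches a critical point, here near (1, 0)
```

Under the Kurdyka-Łojasiewicz assumption highlighted in the abstract, iterations of this general inertial type are the setting in which convergence to a critical point and rate estimates in terms of the Łojasiewicz exponent are typically derived.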