Limited-memory BFGS with displacement aggregation
Publication Type:
Article
Authors:
Berahas, Albert S.; Curtis, Frank E.; Zhou, Baoyu
Affiliation:
Lehigh University
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-021-01621-6
Publication Date:
2022
Pages:
121-157
Keywords:
quasi-Newton methods
variable-metric methods
superlinear convergence
global convergence
nonconvex optimization
algorithm
matrices
family
Abstract:
A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored and employed, such as a local superlinear rate of convergence under assumptions that are common for attaining such guarantees. To the best of our knowledge, this is the first work in which a local superlinear convergence rate guarantee is offered by a quasi-Newton scheme that neither stores all curvature pairs throughout the entire run of the optimization algorithm nor stores an explicit (inverse) Hessian approximation. Numerical results are presented to show that displacement aggregation within an adaptive L-BFGS scheme can lead to better performance than standard L-BFGS.
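For orientation, the sketch below shows the standard L-BFGS two-loop recursion over stored curvature pairs (s_i, y_i), the memory structure on which the paper's displacement aggregation operates. This is a minimal illustration of conventional L-BFGS only, not the aggregation strategy proposed in the paper; the function name and list-based memory representation are illustrative assumptions.

    import numpy as np

    def lbfgs_direction(grad, s_list, y_list):
        """Standard L-BFGS two-loop recursion: returns the search
        direction -H_k * grad built from stored curvature pairs
        s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (oldest first).
        NOTE: illustrative sketch of conventional L-BFGS, not the
        paper's displacement aggregation scheme."""
        q = grad.copy()
        rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: newest pair to oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * np.dot(s, q)
            alphas.append(alpha)
            q -= alpha * y
        # Initial inverse-Hessian scaling gamma_k = s'y / y'y.
        if s_list:
            gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
        else:
            gamma = 1.0
        r = gamma * q
        # Second loop: oldest pair to newest (alphas consumed in reverse).
        for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
            beta = rho * np.dot(y, r)
            r += (alpha - beta) * s
        return -r

With an empty memory the recursion reduces to steepest descent (-grad); as pairs accumulate, the implicit inverse Hessian approximation improves. The paper's contribution concerns how such stored pairs can be aggregated so that the limited-memory approximation matches the full-memory BFGS one.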