Riemannian Anderson Mixing Methods for Minimizing C2 Functions on Riemannian Manifolds

Publication type:
Article; Early Access
Authors:
Li, Zanyu; Bao, Chenglong
Affiliations:
Tsinghua University; Yanqi Lake Beijing Institute of Mathematical Sciences & Applications
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2023.0284
Publication year:
2025
Keywords:
optimization methods; convergence analysis; acceleration; algorithms; retraction
Abstract:
The Anderson mixing (AM) method is a popular approach for accelerating fixed-point iterations by leveraging historical information from previous steps. In this paper, we introduce the Riemannian Anderson mixing (RAM) method, an extension of AM to Riemannian manifolds, and analyze its local linear convergence under reasonable assumptions. Unlike other extrapolation-based algorithms on Riemannian manifolds, RAM does not require computing the inverse retraction or inverse exponential mapping and has a lower per-iteration cost. Furthermore, we propose a variant of RAM called regularized RAM (RRAM), which establishes global convergence and exhibits local convergence properties similar to RAM. Our proof relies on careful error estimations based on the local geometry of Riemannian manifolds. Finally, we present experimental results on various manifold optimization problems that demonstrate the superior performance of our proposed methods over existing Riemannian gradient descent and limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) approaches.
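To illustrate the acceleration idea the abstract builds on, the following is a minimal sketch of classical (Euclidean) Anderson mixing for a fixed-point iteration x = g(x), not the paper's Riemannian RAM/RRAM algorithms. The function name `anderson_mixing` and the memory parameter `m` are illustrative choices; the extrapolation coefficients are obtained from a standard least-squares fit on the history of residual differences.

```python
import numpy as np

def anderson_mixing(g, x0, m=5, tol=1e-10, max_iter=100):
    """Classical (Euclidean) Anderson mixing for the fixed point x = g(x).

    Keeps the last m differences of iterates (dX) and residuals (dF) and
    takes the extrapolated step
        x_{k+1} = x_k + f_k - (dX + dF) @ gamma,
    where gamma solves the least-squares problem min ||f_k - dF @ gamma||.
    """
    x = np.asarray(x0, dtype=float)
    f = g(x) - x                      # residual of the fixed-point map
    Xs, Fs = [], []                   # histories of iterate/residual differences
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        if Xs:
            dX = np.column_stack(Xs)
            dF = np.column_stack(Fs)
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x_new = x + f - (dX + dF) @ gamma
        else:
            x_new = x + f             # plain fixed-point step to start
        f_new = g(x_new) - x_new
        Xs.append(x_new - x)
        Fs.append(f_new - f)
        if len(Xs) > m:               # truncate history to memory size m
            Xs.pop(0)
            Fs.pop(0)
        x, f = x_new, f_new
    return x
```

For example, applying it to g = cos recovers the fixed point of x = cos(x) (the Dottie number, approximately 0.739085) in far fewer iterations than the plain Picard iteration. The Riemannian versions in the paper replace these vector-space updates with retraction-based steps on the manifold.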