Adaptive regularization with cubics on manifolds
Publication type:
Article
Authors:
Agarwal, Naman; Boumal, Nicolas; Bullins, Brian; Cartis, Coralia
Affiliations:
Alphabet Inc.; Google Incorporated; Princeton University; Toyota Technological Institute - Chicago; University of Oxford
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-020-01505-1
Publication date:
2021
Pages:
85-134
Keywords:
Riemannian manifolds
Newton's method
optimization
optimality
complexity
Abstract:
Adaptive regularization with cubics (ARC) is an algorithm for unconstrained, non-convex optimization. Akin to the trust-region method, its iterations can be thought of as approximate, safeguarded Newton steps. For cost functions with Lipschitz continuous Hessian, ARC has optimal iteration complexity, in the sense that it produces an iterate with gradient smaller than epsilon in O(1/epsilon^1.5) iterations. For the same price, it can also guarantee a Hessian with smallest eigenvalue larger than -epsilon. In this paper, we study a generalization of ARC to optimization on Riemannian manifolds. In particular, we generalize the iteration complexity results to this richer framework. Our central contribution lies in the identification of appropriate manifold-specific assumptions that allow us to secure these complexity guarantees both when using the exponential map and when using a general retraction. A substantial part of the paper is devoted to studying these assumptions (relevant beyond ARC) and providing user-friendly sufficient conditions for them. Numerical experiments are encouraging.
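To make the abstract's description concrete, the following is a minimal, hedged sketch of a single cubic-regularized (ARC-style) step in the Euclidean setting: the local model adds a cubic penalty (sigma/3)||s||^3 to the second-order Taylor model, and the step is an approximate minimizer of that model. The cost function, the fixed regularization weight `sigma`, and the crude gradient-descent model solver are illustrative choices, not the paper's method; the paper's contribution is the Riemannian case, where `s` lives in a tangent space and the update uses the exponential map or a retraction.

```python
import numpy as np

# Illustrative smooth, non-convex-friendly test cost (not from the paper).
def f(x):
    return x[0]**4 + x[1]**2

def grad(x):
    return np.array([4.0 * x[0]**3, 2.0 * x[1]])

def hess(x):
    return np.array([[12.0 * x[0]**2, 0.0],
                     [0.0,            2.0]])

def arc_step(x, sigma, iters=500, lr=0.05):
    """Approximately minimize the cubic-regularized model
    m(s) = f(x) + g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    by plain gradient descent on m (a stand-in for the
    specialized subproblem solvers used in practice)."""
    g, H = grad(x), hess(x)
    s = np.zeros_like(x)
    for _ in range(iters):
        # Gradient of the model: g + H s + sigma ||s|| s
        m_grad = g + H @ s + sigma * np.linalg.norm(s) * s
        s -= lr * m_grad
    return s

x = np.array([1.0, 1.0])
s = arc_step(x, sigma=1.0)
# The cubic term safeguards the Newton-like step; for a well-scaled
# model the step should decrease the cost.
```

In full ARC, `sigma` is adapted between iterations based on how well the model predicted the actual decrease, which is what gives the O(1/epsilon^1.5) complexity under a Lipschitz-Hessian assumption.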