Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method

Publication Type:
Article; Early Access
Author(s):
Doikov, Nikita
Affiliation(s):
Swiss Federal Institutes of Technology Domain; Ecole Polytechnique Federale de Lausanne
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-025-02270-9
Publication Date:
2025
Keywords:
case; evaluation complexity; cubic regularization; optimization; inexact
Abstract:
We study composite convex optimization problems with a quasi-self-concordant smooth component. This problem class naturally interpolates between classic self-concordant functions and functions with Lipschitz continuous Hessian. Previously, the best complexity bounds for this problem class were associated with trust-region schemes and implementations of a ball optimization oracle. In this paper, we show that for minimizing quasi-self-concordant functions we can instead use the basic Newton method with gradient regularization. For unconstrained minimization, it only involves a simple matrix inversion operation (solving a linear system) at each step. We prove a fast global linear rate for this algorithm, matching the complexity bound of the trust-region scheme, while our method remains especially simple to implement. Then, we introduce the dual Newton method and, based on it, develop the corresponding accelerated Newton scheme for this problem class. This scheme further improves the complexity factor of the basic method, matching, up to logarithmic factors, the state-of-the-art rates of accelerated methods achieved within the framework of the ball optimization oracle. As a direct consequence of our results, we establish fast global linear rates for simple variants of the Newton method applied to several practical problems, including logistic regression, soft maximum, and matrix scaling, without requiring additional assumptions of strong or uniform convexity on the target objective.
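The abstract describes a Newton step in which the Hessian is regularized by a multiple of the current gradient norm, so each iteration reduces to solving one linear system. Below is a minimal sketch of that idea, not the paper's exact algorithm: the step rule `lam = M * ||g||` and the constant `M` are assumptions standing in for the quasi-self-concordance parameter, and the logistic-regression example (one of the applications named in the abstract) uses synthetic data.

```python
import numpy as np

def grad_reg_newton(grad, hess, x0, M=1.0, tol=1e-8, max_iter=100):
    """Sketch of a gradient-regularized Newton method.

    Each step solves (H(x) + lam * I) d = -g(x) with
    lam = M * ||g(x)||, i.e. the regularization weight is
    proportional to the current gradient norm. `M` stands in
    for the quasi-self-concordance parameter (an assumption
    of this sketch, not a value from the paper).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn <= tol:
            break
        lam = M * gn
        # One linear system per iteration: the only costly operation.
        d = np.linalg.solve(hess(x) + lam * np.eye(len(x)), -g)
        x = x + d
    return x

# Toy logistic regression: f(x) = (1/n) sum log(1 + exp(-b_i a_i^T x)).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
b = rng.choice([-1.0, 1.0], size=50)

def f_grad(x):
    p = 1.0 / (1.0 + np.exp(b * (A @ x)))      # sigmoid(-b_i a_i^T x)
    return A.T @ (-b * p) / len(b)

def f_hess(x):
    p = 1.0 / (1.0 + np.exp(b * (A @ x)))
    w = p * (1.0 - p) / len(b)
    return A.T @ (A * w[:, None])

x_star = grad_reg_newton(f_grad, f_hess, np.zeros(3))
```

Note that no strong-convexity assumption is used anywhere in the sketch; per the abstract, the fast global rate for such problems follows from quasi-self-concordance alone.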
Source URL: