Complementary composite minimization, small gradients in general norms, and applications

Publication type:
Article
Authors:
Diakonikolas, Jelena; Guzman, Cristobal
Affiliations:
University of Wisconsin System; University of Wisconsin Madison; Pontificia Universidad Catolica de Chile
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-023-02040-5
Publication year:
2024
Pages:
319-363
Keywords:
first-order methods; convex optimization; stability
Abstract:
Composite minimization is a powerful framework in large-scale convex optimization: it decouples the objective function into terms with structurally different properties, allowing for more flexible algorithmic design. We introduce a new algorithmic framework for complementary composite minimization, where the objective function decouples into a (weakly) smooth term and a uniformly convex term. This particular form of decoupling is pervasive in statistics and machine learning, due to its link to regularization. The main contributions of our work are summarized as follows. First, we introduce the problem of complementary composite minimization in general normed spaces; second, we provide a unified accelerated algorithmic framework to address broad classes of complementary composite minimization problems; and third, we prove that the algorithms resulting from our framework are near-optimal in most of the standard optimization settings. Additionally, we show that our algorithmic framework can be used to address the problem of making the gradients small in general normed spaces. As a concrete example, we obtain a nearly-optimal method for the standard ℓ1 setup (small gradients in the ℓ∞ norm), essentially matching the bound of Nesterov (Optima Math Optim Soc Newsl 88:10-11, 2012) that was previously known only for the Euclidean setup. Finally, we show that our composite methods are broadly applicable to a number of regression problems and other classes of optimization problems where regularization plays a key role. Our methods lead to complexity bounds that are either new or match the best existing ones.
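To make the composite structure concrete, the sketch below illustrates the simplest instance of the setting described in the abstract: a smooth term f plus a strongly convex regularizer ψ, handled by a generic proximal gradient method. This is an illustrative example only, not the paper's accelerated algorithm; the choices f(x) = ½‖Ax − b‖², ψ(x) = (μ/2)‖x‖², and the function name `prox_grad_composite` are assumptions made for the demonstration.

```python
import numpy as np

def prox_grad_composite(A, b, mu, steps=2000):
    """Proximal gradient descent on f(x) + psi(x), where
    f(x) = 0.5 * ||Ax - b||^2 is the smooth term and
    psi(x) = (mu/2) * ||x||^2 is the strongly convex regularizer.
    The prox of psi has the closed form x / (1 + eta * mu)."""
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad f
    eta = 1.0 / L                        # standard step size for the smooth part
    for _ in range(steps):
        g = A.T @ (A @ x - b)            # gradient of the smooth term only
        x = (x - eta * g) / (1.0 + eta * mu)  # gradient step, then prox of psi
    return x
```

Because ψ here is quadratic, the result can be checked against the closed-form ridge solution x* = (AᵀA + μI)⁻¹Aᵀb; the decoupled update touches f only through its gradient and ψ only through its prox, which is the flexibility the composite framework is built around.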