The Boosted Double-proximal Subgradient Algorithm for nonconvex optimization
Publication type:
Article; Early Access
Authors:
Aragon-Artacho, Francisco J.; Perez-Aros, Pedro; Torregrosa-Belen, David
Affiliations:
Universitat d'Alacant; Universidad de Chile; Universidad de Chile
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-024-02190-0
Publication date:
2025
Keywords:
point algorithm
descent methods
convergence
difference
minimization
sum
Abstract:
In this paper we introduce the Boosted Double-proximal Subgradient Algorithm (BDSA), a novel splitting algorithm designed to address general structured nonsmooth and nonconvex mathematical programs expressed as sums and differences of composite functions. BDSA exploits the combined nature of subgradients from the data and proximal steps, and integrates a linesearch procedure to enhance its performance. BDSA encompasses existing schemes proposed in the literature while extending their applicability to a broader range of problems. We establish the convergence of BDSA under the Kurdyka-Łojasiewicz property and provide an analysis of its convergence rate. To evaluate the effectiveness of BDSA, we introduce two novel test functions with an abundance of critical points. We conduct comparative evaluations, including against algorithms with inertial terms, that illustrate its ability to effectively escape non-optimal critical points. Additionally, we present two practical applications of BDSA for testing its efficacy, namely, a constrained minimum-sum-of-squares clustering problem and a nonconvex generalization of Heron's problem.
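The abstract names the main ingredients of BDSA (explicit gradient/subgradient steps, a proximal step, and a boosting linesearch along the computed direction). The following is only a minimal illustrative sketch of an iteration of that flavour, assuming a simplified model problem min_x f(x) + g(x) - h(x) with f smooth, g proximable and h convex; the test data, the helper names (grad_f, prox_g, subgrad_h) and the linesearch parameters are assumptions for illustration, not the paper's exact formulation or parameter choices.

```python
# Hedged sketch (not the paper's exact method): one "double-proximal subgradient"
# style iteration with a boosting linesearch, for an assumed model
#     min_x  f(x) + g(x) - h(x),
# with f smooth, g proximable, h convex. Data and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
lam, mu = 0.1, 0.05                                   # weights of g and h

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)          # smooth part
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: lam * np.sum(np.abs(x))                 # nonsmooth, proximable part
prox_g = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)
h = lambda x: mu * np.linalg.norm(x)                  # subtracted (concave) part
subgrad_h = lambda x: (mu * x / np.linalg.norm(x)
                       if np.linalg.norm(x) > 0 else np.zeros_like(x))

phi = lambda x: f(x) + g(x) - h(x)                    # full objective

def bdsa_like_step(x, gamma=0.01, lam0=2.0, rho=0.1, beta=0.5):
    """One iteration: forward step on f and -h, proximal step on g, then a
    backtracking linesearch along d = y - x that 'boosts' the update."""
    u = subgrad_h(x)                                  # subgradient of h at x
    y = prox_g(x - gamma * (grad_f(x) - u), gamma)    # proximal(-gradient) step
    d = y - x                                         # boosting direction
    step = lam0
    # Accept y + step*d only if it sufficiently decreases phi; otherwise shrink.
    while step > 1e-10 and phi(y + step * d) > phi(y) - rho * step**2 * (d @ d):
        step *= beta
    return y + step * d if step > 1e-10 else y

x = np.zeros(10)
for _ in range(200):
    x = bdsa_like_step(x)
print("objective value after 200 iterations:", phi(x))
```

The linesearch mirrors the "boosted" idea of extrapolating past the proximal point along d = y - x whenever this yields extra decrease, falling back to y itself when no acceptable step is found.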
Source URL: