Randomized Block-Coordinate Optimistic Gradient Algorithms for Root-Finding Problems
Publication Type:
Article; Early Access
Authors:
Tran-Dinh, Quoc; Luo, Yang
Affiliations:
University of North Carolina at Chapel Hill; University of North Carolina School of Medicine
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2024.0414
Publication Date:
2025
Keywords:
backward splitting method
variational inequalities
monotone operators
convergence
inclusions
efficiency
Abstract:
In this paper, we develop two new randomized block-coordinate optimistic gradient algorithms to approximate a solution of nonlinear equations in large-scale settings, known as root-finding problems. Our first algorithm is nonaccelerated with constant step sizes and achieves an $\mathcal{O}(1/k)$ best-iterate convergence rate on $\mathbb{E}[\|Gx^k\|^2]$ when the underlying operator $G$ is Lipschitz continuous and possesses a weak Minty solution, where $\mathbb{E}[\cdot]$ denotes the expectation and $k$ is the iteration counter. Our second method is a new accelerated randomized block-coordinate optimistic gradient algorithm. We establish both $\mathcal{O}(1/k^2)$ and $o(1/k^2)$ last-iterate convergence rates on both $\mathbb{E}[\|Gx^k\|^2]$ and $\mathbb{E}[\|x^{k+1} - x^k\|^2]$ for this algorithm under the co-coercivity of $G$. In addition, we prove that the iterate sequence $\{x^k\}$ converges to a solution almost surely and that $\|Gx^k\|$ attains an $o(1/k)$ almost sure convergence rate. We then apply our methods to a class of large-scale finite-sum inclusions, which covers prominent applications in machine learning, statistical learning, and network optimization, especially in federated learning. We obtain two new federated learning-type algorithms and their convergence rate guarantees for solving this problem class. We test our algorithms on four numerical examples using both synthetic and real data and compare them with related methods. Our numerical experiments show promising performance of the proposed methods against their competitors.
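To make the flavor of the method concrete: the abstract does not state the paper's exact update rule, so the sketch below only illustrates the general idea of a randomized block-coordinate optimistic gradient iteration for $G(x) = 0$, using the classical optimistic step $x^{k+1} = x^k - \eta\,(2Gx^k - Gx^{k-1})$ restricted to one uniformly sampled coordinate block per iteration. The solver name `block_og_solver`, the toy monotone linear operator, the uniform block sampling, and the step size `eta` are all illustrative assumptions, not the authors' scheme or its step-size conditions.

```python
import numpy as np

def block_og_solver(G, x0, n_blocks, eta=0.05, iters=20000, rng=None):
    """Hedged sketch of a randomized block-coordinate optimistic gradient
    method for the root-finding problem G(x) = 0.

    Each iteration samples one coordinate block uniformly at random and
    applies the classical optimistic (extrapolated) step to that block only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    g_prev = G(x)  # plays the role of G(x^{k-1}) in the first correction term
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        g = G(x)                        # full operator evaluation G(x^k)
        i = rng.integers(n_blocks)      # sample one block uniformly
        b = blocks[i]
        x[b] -= eta * (2.0 * g[b] - g_prev[b])  # optimistic step on block b
        g_prev = g
    return x

# Toy example (assumed, not from the paper): a monotone linear operator
# G(x) = A x - b, where A = PSD part + skew part + ridge, so G is Lipschitz
# continuous and strongly monotone, hence has a unique root.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T / n + (M - M.T) / (2.0 * np.sqrt(n)) + 0.5 * np.eye(n)
b = rng.standard_normal(n)
G = lambda x: A @ x - b

x = block_og_solver(G, np.zeros(n), n_blocks=4, rng=rng)
print("residual ||G(x)|| =", np.linalg.norm(G(x)))
```

Note that this sketch evaluates the full operator $G$ at every iteration and randomizes only the update; the appeal of the paper's block-coordinate setting is precisely that, for structured problems such as the finite-sum inclusions mentioned above, only the sampled block of the operator needs to be computed or communicated (as in federated learning).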
Source URL: