A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization

Publication Type:
Article
Authors:
Hong, Mingyi; Chang, Tsung-Hui; Wang, Xiangfeng; Razaviyayn, Meisam; Ma, Shiqian; Luo, Zhi-Quan
Affiliations:
University of Minnesota System; University of Minnesota Twin Cities; The Chinese University of Hong Kong, Shenzhen; Shenzhen Research Institute of Big Data; East China Normal University; University of Southern California; University of California System; University of California Davis
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2019.1010
Publication Date:
2020
Pages:
833-861
Keywords:
alternating direction method; coordinate descent method; proximal gradient method; iteration complexity; convergence analysis; splitting method; cognitive radio; algorithm; decomposition; l(1)-minimization
Abstract:
Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications, including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal-dual algorithms called the block successive upper-bound minimization method of multipliers (BSUM-M) to solve this family of problems. The BSUM-M updates the primal variable blocks successively by minimizing locally tight upper bounds of the augmented Lagrangian of the original problem, followed by a gradient-type update for the dual variable in closed form. We show that under certain regularity conditions, and when the primal block variables are updated in either a deterministic or a random fashion, the BSUM-M converges to a point in the set of optimal solutions. Moreover, in the absence of linear constraints and under similar conditions as in the previous result, we show that the randomized BSUM-M (which reduces to the randomized block successive upper-bound minimization method) converges at an asymptotically linear rate without relying on strong convexity.
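A minimal sketch of the problem and iteration described in the abstract, in generic notation: the block matrices E_k, right-hand side q, penalty parameter rho, upper-bound functions u_k, and dual stepsize alpha are placeholder symbols chosen here for illustration and are not necessarily the paper's own notation.

% Problem: smooth convex f plus separable nonsmooth h_k,
% with K variable blocks coupled by a linear constraint.
\begin{align*}
  \min_{x_1,\dots,x_K}\ & f(x_1,\dots,x_K) + \sum_{k=1}^{K} h_k(x_k)
  \quad \text{s.t.} \quad \sum_{k=1}^{K} E_k x_k = q, \\[4pt]
% Augmented Lagrangian with penalty parameter rho > 0:
  L_\rho(x,y) =\ & f(x) + \sum_{k=1}^{K} h_k(x_k)
  + \Big\langle y,\ q - \sum_{k=1}^{K} E_k x_k \Big\rangle
  + \frac{\rho}{2}\Big\| q - \sum_{k=1}^{K} E_k x_k \Big\|^2, \\[4pt]
% One pass of the method: each block minimizes a locally tight
% upper bound u_k of L_rho at the current iterate (successively,
% so block k sees the already-updated blocks 1..k-1), followed by
% a gradient-type dual ascent step with stepsize alpha > 0.
  x_k^{r+1} \in\ & \arg\min_{x_k}\ u_k\big(x_k;\ x_{<k}^{r+1},\ x_{\ge k}^{r},\ y^r\big),
  \qquad k = 1,\dots,K, \\
  y^{r+1} =\ & y^r + \alpha \Big( q - \sum_{k=1}^{K} E_k x_k^{r+1} \Big).
\end{align*}

Since the partial gradient of L_rho with respect to y is q minus the constraint residual's image, the dual step above is plain gradient ascent, which is why it is available in closed form; the randomized variant mentioned in the abstract would draw the block index k at random instead of sweeping k = 1,...,K.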