Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization

Publication type:
Article
Authors:
Hua, Xiaoqin; Yamashita, Nobuo
Affiliations:
Jiangsu University of Science & Technology; Kyoto University
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-015-0969-z
Publication date:
2016
Pages:
1-32
Keywords:
descent method; convergence; algorithms; Lasso
Abstract:
In this paper, we propose a class of block coordinate proximal gradient (BCPG) methods for solving large-scale nonsmooth separable optimization problems. The proposed BCPG methods are based on Bregman functions, which may vary at each iteration. These methods include many well-known optimization methods as special cases, such as the quasi-Newton method, the block coordinate descent method, and the proximal point method. For the proposed methods, we establish global convergence when the blocks are selected by the Gauss-Seidel rule. Further, under additional appropriate assumptions, we show that the convergence rate of the proposed methods is R-linear. We also present numerical results for a new BCPG method with variable kernels on a convex problem with separable simplex constraints.
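
For orientation, here is a minimal sketch of the generic update behind the BCPG scheme the abstract describes; the notation (f for the smooth part, g_i for the nonsmooth separable parts, \psi_i^k for the iteration-dependent kernels) is assumed for illustration and need not match the paper's own. For the problem \min_x f(x) + \sum_i g_i(x_i), with block i chosen by the Gauss-Seidel rule at iteration k, the update reads

\[
x_i^{k+1} \in \operatorname*{argmin}_{x_i}\; \langle \nabla_i f(x^k),\, x_i - x_i^k \rangle + g_i(x_i) + D_{\psi_i^k}(x_i,\, x_i^k),
\]
\[
\text{where } D_{\psi}(u, v) = \psi(u) - \psi(v) - \langle \nabla \psi(v),\, u - v \rangle
\]

is the Bregman distance induced by a strictly convex kernel \psi. Choosing \psi_i^k(u) = \tfrac{L_i}{2}\|u\|^2 recovers an ordinary block proximal gradient step, while a quadratic kernel built from a Hessian approximation yields a quasi-Newton-type step, consistent with the special cases listed in the abstract.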