Incremental proximal methods for large scale convex optimization

Document type:
Article
Author(s):
Bertsekas, Dimitri P.
Affiliation:
Massachusetts Institute of Technology (MIT)
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-011-0472-0
Publication date:
2011
Pages:
163-195
Keywords:
least-squares; thresholding algorithm; subgradient methods; monotone operators; gradient methods; projection; convergence; noise; sum
Abstract:
We consider the minimization of a sum $\sum_{i=1}^{m} f_i(x)$ consisting of a large number of convex component functions $f_i$. For this problem, incremental methods consisting of gradient or subgradient iterations applied to single components have proved very effective. We propose new incremental methods, consisting of proximal iterations applied to single components, as well as combinations of gradient, subgradient, and proximal iterations. We provide a convergence and rate of convergence analysis of a variety of such methods, including some that involve randomization in the selection of components. We also discuss applications in a few contexts, including signal processing and inference/machine learning.
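For context, the basic incremental proximal iteration described in the abstract can be sketched as follows; the stepsize $\alpha_k$, the constraint set $X$, and the component index $i_k$ (chosen cyclically or at random from $\{1, \ldots, m\}$) are notation supplied here for illustration, not quoted from the record:

$$x_{k+1} = \arg\min_{x \in X} \Big\{ f_{i_k}(x) + \frac{1}{2\alpha_k} \lVert x - x_k \rVert^2 \Big\}$$

Unlike an incremental gradient or subgradient step, each update solves a small proximal subproblem involving only the single component $f_{i_k}$, which tends to behave more stably with respect to the stepsize choice.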