Proximal Gradient Methods with Adaptive Subspace Sampling
Result Type:
Article
Authors:
Grishchenko, Dmitry; Iutzeler, Franck; Malick, Jerome
Affiliations:
Communaute Universite Grenoble Alpes; Institut National Polytechnique de Grenoble; Universite Grenoble Alpes (UGA); Centre National de la Recherche Scientifique (CNRS); Inria
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2020.1092
Publication Date:
2021
Pages:
1303-1323
Keywords:
optimization
algorithm
sparsity
identification
convergence
smoothness
Abstract:
Many applications in machine learning and signal processing involve nonsmooth optimization problems. This nonsmoothness typically induces a low-dimensional structure in the optimal solutions. In this paper, we propose a randomized proximal gradient method that harnesses this underlying structure. We introduce two key components: (i) a random subspace proximal gradient algorithm and (ii) an identification-based sampling of the subspaces. Their interplay brings a significant performance improvement on typical learning problems, measured in terms of dimensions explored.
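To illustrate the idea of component (i), the following is a minimal sketch (not the authors' exact algorithm) of a randomized subspace proximal gradient step on the Lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1: at each iteration only a random subset of coordinates receives a gradient-plus-prox update, so each step explores a low-dimensional subspace. The function names and the fixed sampling probability `p` are illustrative assumptions.

```python
# Hypothetical sketch of a randomized subspace proximal gradient method
# for the Lasso; the sampling here is uniform with fixed probability p,
# not the identification-based adaptive sampling studied in the paper.
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def subspace_prox_grad(A, b, lam, n_iters=500, p=0.5, seed=0):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1, updating a random
    coordinate subspace at each iteration."""
    rng = np.random.default_rng(seed)
    _, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iters):
        mask = rng.random(n) < p       # sample a random coordinate subspace
        grad = A.T @ (A @ x - b)
        # gradient step followed by prox, restricted to the sampled subspace
        x[mask] = soft_threshold(x[mask] - step * grad[mask], step * lam)
    return x
```

The identification property of the prox means that, after finitely many iterations, the sparsity pattern of the iterates stabilizes; the paper's component (ii) exploits this by biasing the sampling toward the identified support.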
Source URL: