Primal and dual predicted decrease approximation methods
Publication type:
Article
Authors:
Beck, Amir; Pauwels, Edouard; Sabach, Shoham
Affiliations:
Technion Israel Institute of Technology; Universite de Toulouse; Universite Toulouse III - Paul Sabatier
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-017-1108-9
Publication date:
2018
Pages:
37-73
Keywords:
conditional gradient method
support
convergence
minimization
algorithms
time
Abstract:
We introduce the notion of predicted decrease approximation (PDA) for constrained convex optimization, a flexible framework that includes as special cases known algorithms such as the generalized conditional gradient method, proximal gradient method, greedy coordinate descent for separable constraints, and working set methods for linear equality constraints with bounds. The new scheme allows the development of a unified convergence analysis for these methods. We further consider a partially strongly convex nonsmooth model and show that dual application of PDA-based methods yields new sublinear convergence rate estimates in terms of both primal and dual objectives. As an example application, we provide an explicit working set selection rule for SMO-type methods for training the support vector machine, with an improved primal convergence analysis.
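To make the abstract's first special case concrete, here is a minimal, hedged sketch of the conditional gradient (Frank-Wolfe) method, whose duality gap is the simplest instance of a "predicted decrease" quantity. The problem, function names, and constants below are illustrative assumptions, not taken from the paper: we minimize a least-squares objective over the unit simplex.

```python
# Illustrative sketch (not the paper's algorithm): conditional gradient /
# Frank-Wolfe over the unit simplex, where the gap grad·(x - s) plays the
# role of a predicted decrease and upper-bounds f(x) - f*.
import numpy as np

def frank_wolfe_simplex(A, b, iters=200):
    """Minimize 0.5*||Ax - b||^2 over {x >= 0, sum(x) = 1}."""
    n = A.shape[1]
    x = np.full(n, 1.0 / n)           # feasible start: simplex barycenter
    gap = np.inf
    for k in range(iters):
        grad = A.T @ (A @ x - b)      # gradient of the quadratic objective
        s = np.zeros(n)               # linear minimization oracle on the
        s[np.argmin(grad)] = 1.0      # simplex: pick the best vertex
        gap = grad @ (x - s)          # predicted decrease (Frank-Wolfe gap)
        if gap <= 1e-10:
            break
        step = 2.0 / (k + 2)          # classical open-loop step size
        x = (1 - step) * x + step * s # convex combination stays feasible
    return x, gap

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = A @ rng.dirichlet(np.ones(10))    # target reachable from the simplex
x, gap = frank_wolfe_simplex(A, b)
```

The iterate remains a convex combination of simplex vertices, so feasibility is maintained without any projection; by convexity the gap is nonnegative and certifies suboptimality, which is what a PDA-style analysis exploits.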
Source URL: