OSGA: a fast subgradient algorithm with optimal complexity

Publication type:
Article
Author(s):
Neumaier, Arnold
Affiliation:
University of Vienna
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-015-0911-4
Publication year:
2016
Pages:
1-21
Keywords:
gradient methods; convex minimization; sparse
Abstract:
This paper presents an algorithm for approximately minimizing a convex function in simple, not necessarily bounded convex, finite-dimensional domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be put to zero if only convexity is known). The worst case number of iterations needed to achieve a given accuracy is independent of the dimension and-apart from a constant factor-best possible under a variety of smoothness assumptions on the objective function.