Interior gradient and epsilon-subgradient descent methods for constrained convex minimization
Publication type:
Article
Authors:
Auslender, A; Teboulle, M
Affiliations:
Universite Claude Bernard Lyon 1; Tel Aviv University
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN:
0364-765X
DOI:
10.1287/moor.1030.0062
Publication date:
2004
Pages:
1-26
Keywords:
proximal methods
multiplier methods
convergence
algorithms
Abstract:
We extend epsilon-subgradient descent methods for unconstrained nonsmooth convex minimization to constrained problems over polyhedral sets, in particular over the nonnegative orthant R^p_+. This is done by replacing the usual quadratic regularization term used in subgradient schemes with the logarithmic-quadratic distance-like function recently introduced by the authors. We thereby obtain interior epsilon-subgradient descent methods, which provide a natural extension of bundle methods and of Polyak's subgradient projection methods for nonsmooth convex minimization. Similar extensions are developed for smooth constrained minimization, yielding interior gradient descent methods. Global convergence, as well as an improved global efficiency estimate, is established for these methods within a unifying principle and under minimal assumptions on the problem's data.
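To make the abstract's idea concrete, the following is a minimal sketch, not the paper's actual algorithm: it assumes the logarithmic-quadratic kernel takes the form phi(s) = (nu/2)(s-1)^2 + mu*(s - log s - 1) with nu >= mu > 0 and induced distance d(z, x) = sum_i x_i^2 * phi(z_i/x_i), and it illustrates only why replacing the quadratic regularizer by such a distance keeps the iterates strictly inside R^p_+: each coordinate of the prox step reduces to the unique positive root of a quadratic. The function name, parameters, stepsize rule, and toy problem below are all illustrative choices, not taken from the paper.

```python
import numpy as np

def log_quad_step(x, g, t, nu=2.0, mu=1.0):
    """One interior (sub)gradient step over the positive orthant.

    Solves  min_{z > 0}  <g, z> + (1/t) * d(z, x)  coordinatewise, where
    d is the logarithmic-quadratic distance-like function built from the
    assumed kernel phi(s) = (nu/2)(s - 1)^2 + mu*(s - log s - 1).
    Setting the coordinatewise derivative to zero and multiplying by z_i
    gives  nu*z^2 + b*z + c = 0  with c < 0, whose unique positive root
    keeps the next iterate strictly inside R^p_+.
    """
    b = t * g + (mu - nu) * x          # linear coefficient of the quadratic
    c = -mu * x**2                     # constant term (always negative)
    return (-b + np.sqrt(b**2 - 4.0 * nu * c)) / (2.0 * nu)


# Toy usage (illustrative only): minimize ||A x - b||_1 over x >= 0
# with a diminishing stepsize and exact subgradients.
rng = np.random.default_rng(0)
A, bvec = rng.standard_normal((20, 5)), rng.standard_normal(20)
x = np.ones(5)                         # strictly positive starting point
for k in range(1, 200):
    g = A.T @ np.sign(A @ x - bvec)    # a subgradient of the l1 objective
    x = log_quad_step(x, g, t=1.0 / k)
print(x)                               # stays componentwise > 0 throughout
```

Unlike a Euclidean projected-subgradient step, no projection onto R^p_+ is needed here: positivity is enforced by the barrier-like log term itself, which is the mechanism the abstract refers to as producing "interior" methods.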
Source URL: