Smoothing methods for nonsmooth, nonconvex minimization
Publication type:
Article
Author(s):
Chen, Xiaojun
Affiliation(s):
Hong Kong Polytechnic University
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-012-0569-0
Publication year:
2012
Pages:
71-99
Keywords:
nonlinear complementarity problems
stochastic mathematical programs
gradient sampling algorithm
continuation method
Newton method
equilibrium constraints
variable selection
optimization
convergence
approximation
Abstract:
We consider a class of smoothing methods for minimization problems whose feasible set is convex but whose objective function is not convex, not differentiable, and perhaps not even locally Lipschitz at the solutions. Such optimization problems arise in a wide range of applications, including image restoration, signal reconstruction, variable selection, optimal control, stochastic equilibrium and spherical approximations. In this paper, we focus on smoothing methods for solving such optimization problems, which use the structure of the minimization problems and compositions of smoothing functions for the plus function (x)_+ = max(x, 0). Many existing optimization algorithms and codes can be used in the inner iteration of the smoothing methods. We present properties of the smoothing functions and the gradient consistency of the subdifferential associated with a smoothing function. Moreover, we describe how to update the smoothing parameter in the outer iteration of the smoothing methods to guarantee convergence of the smoothing methods to a stationary point of the original minimization problem.
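To make the inner/outer structure described in the abstract concrete, the following is a minimal Python sketch. It assumes one standard smoothing of the plus function, the "neural network" smoothing mu*ln(1 + exp(t/mu)), and a simple illustrative rule that shrinks mu whenever the smoothed gradient is small relative to mu. The names plus_smooth and smoothing_method, the specific smoothing function, and the update rule are assumptions for illustration, not the specific functions or parameter schedule analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit


def plus_smooth(t, mu):
    """Smooth approximation of the plus function (t)_+ = max(t, 0):
    the 'neural network' smoothing mu * ln(1 + exp(t/mu)), which
    converges uniformly to (t)_+ as mu -> 0 (illustrative choice)."""
    return mu * np.logaddexp(0.0, t / mu)


def smoothing_method(f_smooth, grad_smooth, x0, mu0=1.0, sigma=0.5,
                     gamma=1.0, mu_min=1e-6, max_outer=50):
    """Generic smoothing method (illustrative update rule): solve the
    smoothed problem for fixed mu with an off-the-shelf inner solver,
    then shrink mu once the smoothed gradient is small relative to mu."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(max_outer):
        # Inner iteration: any existing smooth optimization code can be used.
        res = minimize(lambda z: f_smooth(z, mu), x,
                       jac=lambda z: grad_smooth(z, mu), method="L-BFGS-B")
        x = res.x
        # Outer iteration: update the smoothing parameter.
        if np.linalg.norm(grad_smooth(x, mu)) <= gamma * mu:
            mu *= sigma
        if mu <= mu_min:
            break
    return x, mu


# Usage example (hypothetical data): smoothed l1-regularized least squares,
# writing |t| = (t)_+ + (-t)_+ and smoothing each plus function.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
lam = 0.1

def f_smooth(x, mu):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(
        plus_smooth(x, mu) + plus_smooth(-x, mu))

def grad_smooth(x, mu):
    # d/dt plus_smooth(t, mu) = expit(t / mu)
    return A.T @ (A @ x - b) + lam * (expit(x / mu) - expit(-x / mu))

x_star, mu_final = smoothing_method(f_smooth, grad_smooth, x0=np.zeros(2))
print(x_star, mu_final)
```

The point of the sketch is the division of labor: the inner solver only ever sees a smooth problem for a fixed mu, while the outer loop drives mu to zero so that, under the gradient-consistency property discussed in the paper, accumulation points are stationary for the original nonsmooth problem.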