Globally convergent derivative-free methods in nonconvex optimization with and without noise

Document type:
Article; Early Access
Authors:
Khanh, Pham Duy; Mordukhovich, Boris S.; Tran, Dat Ba
Affiliation:
Wayne State University
Journal:
MATHEMATICAL PROGRAMMING
ISSN:
0025-5610
DOI:
10.1007/s10107-025-02255-8
Publication year:
2025
Keywords:
descent methods; simplex-method; algorithm; search; minimization; 1st-order
Abstract:
This paper addresses the study of nonconvex derivative-free optimization problems, where only information about either smooth objective functions or their noisy approximations is available. General derivative-free methods are proposed for minimizing differentiable (not necessarily convex) functions with globally Lipschitz continuous gradients, where the accuracy of the approximate gradients interacts with the stepsizes and the exact gradient values. The analysis in the noiseless case guarantees convergence of the gradient sequence to the origin as well as global convergence, with constructive convergence rates for the sequence of iterates, under the Kurdyka-Łojasiewicz property. In the noisy case, the designed algorithms, which do not require any information about the noise level, reach near-stationary points while providing estimates of the required numbers of iterations and function evaluations. For functions with locally Lipschitzian gradients, two algorithms are introduced to handle the noiseless and noisy cases, respectively. The noiseless version is based on the standard backtracking linesearch and achieves fundamental convergence properties similar to those in the globally Lipschitzian case. The noisy version is based on a novel dynamic step linesearch and is shown to reach near-stationary points after a finite number of iterations when the Polyak-Łojasiewicz inequality is imposed. Numerical experiments are conducted on a diverse set of test problems to demonstrate the greater robustness of the newly proposed algorithms in comparison with other finite-difference-based schemes and some highly efficient, production-ready codes from the SciPy library. The experiments also demonstrate that the newly proposed methods can be integrated with acceleration techniques from the smooth optimization literature, significantly enhancing numerical performance and outperforming current state-of-the-art derivative-free algorithms.
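
Illustrative note: the abstract refers to finite-difference-based gradient approximations combined with a backtracking linesearch. The following minimal Python sketch shows a generic scheme of that kind; it is not the paper's algorithm, and the forward-difference increment h, the Armijo parameters, and the Rosenbrock test objective are illustrative assumptions only.

    # Sketch only: generic derivative-free descent using forward finite differences
    # and an Armijo backtracking linesearch (NOT the algorithm proposed in the paper).
    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        """Forward finite-difference approximation of the gradient of f at x."""
        g = np.zeros_like(x)
        fx = f(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def backtracking_dfo(f, x0, alpha0=1.0, beta=0.5, c=1e-4, h=1e-6,
                         tol=1e-5, max_iter=500):
        """Descent with approximate gradients and Armijo sufficient decrease."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = fd_gradient(f, x, h)
            if np.linalg.norm(g) <= tol:      # approximate stationarity test
                break
            alpha, fx = alpha0, f(x)
            # Shrink the stepsize until the Armijo condition holds.
            while f(x - alpha * g) > fx - c * alpha * np.dot(g, g):
                alpha *= beta
                if alpha < 1e-12:             # safeguard against stalling
                    break
            x = x - alpha * g
        return x

    if __name__ == "__main__":
        # Smooth nonconvex Rosenbrock test function (illustrative choice).
        rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        print(backtracking_dfo(rosen, np.array([-1.2, 1.0])))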