Variable metric random pursuit

Type:
Article
Authors:
Stich, S. U.; Mueller, C. L.; Gaertner, B.
Affiliations:
Swiss Federal Institutes of Technology Domain; ETH Zurich; Universite Catholique Louvain; Simons Foundation
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-015-0908-z
Publication date:
2016
Pages:
549-579
Keywords:
convergence conditions; covariance matrix; optimization; adaptation
Abstract:
We consider unconstrained randomized optimization of smooth convex objective functions in the gradient-free setting. We analyze Random Pursuit (RP) algorithms with fixed (F-RP) and variable metric (V-RP). The algorithms use only zeroth-order information about the objective function and compute an approximate solution by repeated optimization over randomly chosen one-dimensional subspaces. The distribution of search directions is dictated by the chosen metric. Variable Metric RP uses novel variants of a randomized zeroth-order Hessian approximation scheme recently introduced by Leventhal and Lewis (Optimization 60(3):329-345, 2011. doi:10.1080/02331930903100141). Here we present (1) a refined analysis of the expected single-step progress of RP algorithms and their global convergence on (strictly) convex functions and (2) novel convergence bounds for V-RP on strongly convex functions. We also quantify how well the employed metric needs to match the local geometry of the function for the RP algorithms to converge at the best possible rate. Our theoretical results are accompanied by numerical experiments comparing V-RP with the derivative-free schemes CMA-ES, Implicit Filtering, Nelder-Mead, NEWUOA, Pattern-Search and Nesterov's gradient-free algorithms.
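The iteration described in the abstract can be illustrated with a minimal sketch of a fixed-metric Random Pursuit step: draw a Gaussian search direction shaped by a chosen metric, then approximately minimize the objective along that line using only function evaluations. This is a hedged illustration, not the authors' implementation; the function name `random_pursuit`, the ternary line search, the search interval `t_range`, and the accept-if-improving rule are assumptions made for this sketch.

```python
import random

def random_pursuit(f, x0, metric_chol=None, iters=500, t_range=(-10.0, 10.0)):
    """Sketch of fixed-metric Random Pursuit (assumed structure, not the paper's code).

    Each step samples a direction u ~ N(0, M), where M = L L^T is the metric
    (identity if metric_chol is None), and approximately solves the 1-D
    problem min_t f(x + t*u) with a zeroth-order ternary search.
    """
    n = len(x0)
    x = list(x0)
    for _ in range(iters):
        # Gaussian direction; shape it by the metric's Cholesky factor L (lower-triangular).
        g = [random.gauss(0.0, 1.0) for _ in range(n)]
        if metric_chol is None:
            u = g
        else:
            u = [sum(metric_chol[i][j] * g[j] for j in range(i + 1)) for i in range(n)]
        # Zeroth-order line search: ternary search on t -> f(x + t*u),
        # valid because f restricted to a line is convex (hence unimodal).
        lo, hi = t_range
        for _ in range(60):
            m1 = lo + (hi - lo) / 3.0
            m2 = hi - (hi - lo) / 3.0
            if f([x[i] + m1 * u[i] for i in range(n)]) < f([x[i] + m2 * u[i] for i in range(n)]):
                hi = m2
            else:
                lo = m1
        t = (lo + hi) / 2.0
        cand = [x[i] + t * u[i] for i in range(n)]
        if f(cand) < f(x):  # keep only improving steps
            x = cand
    return x
```

For example, on the strongly convex quadratic `f(x) = (x0 - 1)^2 + 4*x1^2`, a few hundred iterations starting from `[5.0, -5.0]` drive the function value close to its minimum at `(1, 0)`.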