Accelerated randomized stochastic optimization

Publication type:
Article
Author(s):
Dippon, J
Affiliation:
University of Stuttgart
Journal:
ANNALS OF STATISTICS
ISSN:
0090-5364
DOI:
10.1214/aos/1059655913
Publication date:
2003
Pages:
1260-1281
Keywords:
perturbation gradient approximation; convergence; minima
Abstract:
We propose a general class of randomized gradient estimates to be employed in a recursive search for the minimum of an unknown multivariate regression function. Only two observations per iteration step are used. Special cases include random direction stochastic approximation (Kushner and Clark), simultaneous perturbation stochastic approximation (Spall) and a special kernel-based stochastic approximation method (Polyak and Tsybakov). If the unknown regression function is p-smooth (p >= 2) at the point of minimum, these methods achieve the optimal rate of convergence O(n^{-(p-1)/(2p)}). For both the classical stochastic approximation scheme (Kiefer and Wolfowitz) and the averaging scheme (Ruppert and Polyak) the related asymptotic distributions are computed.
Source URL:
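
Illustrative sketch:
The two-observation recursion described in the abstract can be illustrated with a minimal sketch of simultaneous perturbation stochastic approximation (SPSA), one of the special cases named above, combined with Ruppert-Polyak iterate averaging. This is an assumption-laden sketch, not the paper's method: the quadratic objective, the noise level, and the step-size constants (a, c, alpha, gamma) below are illustrative choices; gamma = 1/4 corresponds to the c_n ~ n^{-1/(2p)} perturbation scaling for p = 2, for which the abstract's rate is n^{-1/4}.

    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_f(x):
        # Hypothetical unknown regression function: a quadratic bowl with
        # minimum at (1, ..., 1), observed with additive Gaussian noise.
        return float(np.sum((x - 1.0) ** 2) + rng.normal(scale=0.1))

    def spsa(x0, n_iter=5000, a=0.5, c=0.5, alpha=0.7, gamma=0.25):
        x = np.asarray(x0, dtype=float)
        avg = np.zeros_like(x)
        for n in range(1, n_iter + 1):
            a_n = a / n ** alpha   # step-size sequence (illustrative)
            c_n = c / n ** gamma   # perturbation-size sequence (illustrative)
            delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher directions
            # Exactly two noisy observations per iteration step.
            y_plus = noisy_f(x + c_n * delta)
            y_minus = noisy_f(x - c_n * delta)
            # Randomized gradient estimate; for +/-1 components, dividing by
            # delta componentwise equals multiplying by it.
            g_hat = (y_plus - y_minus) / (2.0 * c_n) / delta
            x = x - a_n * g_hat
            # Ruppert-Polyak running average of the iterates.
            avg += (x - avg) / n
        return x, avg

    x_last, x_avg = spsa(np.zeros(3))
    print("last iterate:     ", x_last)
    print("averaged iterate: ", x_avg)

With averaging, a slowly decaying step size (alpha < 1) is the usual choice, which is why alpha = 0.7 is used above; the averaged iterate is then typically less noisy than the last iterate.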