Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
Document type:
Article
Authors:
Bolte, Jérôme; Pauwels, Edouard
Affiliations:
Université de Toulouse; Université Toulouse 1 Capitole; Toulouse School of Economics; Université Fédérale Toulouse Midi-Pyrénées (ComUE); Institut National Polytechnique de Toulouse; Université Toulouse III - Paul Sabatier; Centre National de la Recherche Scientifique (CNRS)
Journal:
MATHEMATICAL PROGRAMMING
ISSN/ISBN:
0025-5610
DOI:
10.1007/s10107-020-01501-5
Publication year:
2021
Pages:
19-51
Keywords:
subdifferentials
approximations
Abstract:
Modern problems in AI and numerical analysis require nonsmooth approaches with a flexible calculus. We introduce generalized derivatives called conservative fields, for which we develop a calculus and provide representation formulas. Functions admitting a conservative field are called path differentiable: convex, concave, Clarke regular and semialgebraic Lipschitz continuous functions are all path differentiable. Using Whitney stratification techniques for semialgebraic and definable sets, our model provides variational formulas for nonsmooth automatic differentiation oracles, such as the celebrated backpropagation algorithm in deep learning. Our differential model is applied to establish the convergence in values of nonsmooth stochastic gradient methods as they are implemented in practice.
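Illustrative sketch:
The abstract's point about nonsmooth automatic differentiation oracles can be illustrated with a minimal sketch, assuming PyTorch is available; the composite expression below is chosen for illustration and is not an example taken from the paper. Applying backpropagation to a nonsmooth formula for a smooth function can return a value that is not a classical (or Clarke) subgradient at a kink, and conservative fields are the objects that account for such outputs.

import torch

# relu(x) - relu(-x) equals the identity map x for every real input,
# yet reverse-mode autodiff propagates the convention relu'(0) = 0
# through both terms, so the oracle returns 0 at the kink x = 0.
x = torch.tensor(0.0, requires_grad=True)
y = torch.relu(x) - torch.relu(-x)
y.backward()
print(y.item())       # 0.0 : the function value agrees with the identity map
print(x.grad.item())  # 0.0 : differs from the true derivative 1.0 at x = 0

The returned value 0 is not a Clarke subgradient of the identity map, but it disagrees with the gradient only on a Lebesgue-null set of inputs; this is the kind of behavior the conservative-field model is designed to capture.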