Conservative SPDEs as fluctuating mean field limits of stochastic gradient descent
Publication type:
Article
Authors:
Gess, Benjamin; Gvalani, Rishabh S.; Konarovskyi, Vitalii
Affiliations:
Technical University of Berlin; Max Planck Society; Swiss Federal Institutes of Technology Domain; ETH Zurich; University of Hamburg; National Academy of Sciences Ukraine; Institute of Mathematics of NASU
Journal:
PROBABILITY THEORY AND RELATED FIELDS
ISSN/ISBN:
0178-8051
DOI:
10.1007/s00440-024-01353-6
Publication date:
2025
Pages:
1447-1515
Keywords:
partial differential equations; neural networks; system; deviations; particles; model
Abstract:
The convergence of stochastic interacting particle systems in the mean-field limit to solutions of conservative stochastic partial differential equations (SPDEs) is established, with optimal rate of convergence. As a second main result, a quantitative central limit theorem for such SPDEs is derived, again with optimal rate of convergence. The results apply, in particular, to the convergence, in the mean-field scaling, of stochastic gradient descent dynamics in overparametrized shallow neural networks to solutions of SPDEs. It is shown that including fluctuations in the limiting SPDE improves the rate of convergence and retains information about the fluctuations of stochastic gradient descent in the continuum limit.
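The interacting particle system underlying the abstract can be illustrated with a minimal sketch: SGD on an overparametrized shallow network, where each neuron's parameter pair acts as one "particle" and the network output is an average over the empirical measure of particles. This is a toy illustration under assumed choices (tanh activation, a sine regression target, one-sample SGD steps), not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def shallow_net(a, c, x):
    # Mean-field scaling: the output is the empirical average over N neurons,
    #   f(x) = (1/N) * sum_i c_i * tanh(a_i * x)
    return np.mean(c * np.tanh(a * x))

def sgd_particles(N, steps, lr):
    """Run one-sample SGD on an overparametrized shallow net.

    Each neuron's pair (a_i, c_i) is one interacting 'particle'; the
    particles interact only through the shared prediction error, which is
    the mean-field structure referred to in the abstract.
    """
    a = rng.normal(size=N)
    c = rng.normal(size=N)
    target = np.sin  # toy regression target (assumption)
    for _ in range(steps):
        x = rng.uniform(-np.pi, np.pi)        # one sample per step -> SGD noise
        err = shallow_net(a, c, x) - target(x)
        # Gradients of the loss 0.5 * err**2 w.r.t. each particle;
        # note the 1/N factor inherited from the mean-field scaling.
        t = np.tanh(a * x)
        grad_c = err * t / N
        grad_a = err * c * (1.0 - t**2) * x / N
        # Step size rescaled by N so each particle moves at O(1) speed
        # as N grows (the mean-field learning-rate convention).
        a -= lr * N * grad_a
        c -= lr * N * grad_c
    return a, c

a, c = sgd_particles(N=200, steps=2000, lr=0.05)
# The empirical measure (1/N) * sum_i delta_{(a_i, c_i)} is the object
# whose N -> infinity limit the paper describes via conservative SPDEs.
print(float(np.mean(a)), float(np.mean(c)))
```

In this picture, the paper's first result concerns how fast the empirical measure of the `(a_i, c_i)` particles converges as `N` grows, and the second result quantifies the Gaussian fluctuations around that limit.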
Source URL: