APPROXIMATION BOUNDS FOR RANDOM NEURAL NETWORKS AND RESERVOIR SYSTEMS
Type:
Article
Authors:
Gonon, Lukas; Grigoryeva, Lyudmila; Ortega, Juan-Pablo
Affiliations:
University of Munich; University of Warwick; Nanyang Technological University
Journal:
ANNALS OF APPLIED PROBABILITY
ISSN:
1050-5164
DOI:
10.1214/22-AAP1806
Publication year:
2023
Pages:
28-69
Keywords:
echo state networks
smooth
Abstract:
This work studies approximation based on single-hidden-layer feed-forward and recurrent neural networks with randomly generated internal weights. These methods, in which only the last layer of weights and a few hyperparameters are optimized, have been successfully applied in a wide range of static and dynamic learning problems. Despite the popularity of this approach in empirical tasks, important theoretical questions regarding the relation between the unknown function, the weight distribution, and the approximation rate have remained open. In this work it is proved that, as long as the unknown function, functional, or dynamical system is sufficiently regular, it is possible to draw the internal weights of the random (recurrent) neural network from a generic distribution (not depending on the unknown object) and quantify the error in terms of the number of neurons and the hyperparameters. In particular, this proves that echo state networks with randomly generated weights are capable of approximating a wide class of dynamical systems arbitrarily well and thus provides the first mathematical explanation for their empirically observed success at learning dynamical systems.
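The setting described in the abstract can be illustrated with a minimal echo state network sketch: the recurrent internal weights are drawn at random from a generic distribution and left untrained, and only a linear readout is fitted. This is a hedged NumPy illustration of the general technique, not the authors' construction; all names, scalings (input scale 0.5, spectral radius 0.9, ridge parameter 1e-6), and the moving-average target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 200
# Internal weights drawn from a generic distribution; they are never trained.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
# Rescale so the spectral radius is below 1 (a common sufficient condition
# for the echo state / fading-memory property; assumed here, not from the paper).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def reservoir_states(inputs):
    """Run the input sequence through the fixed random reservoir."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Illustrative target: a causal fading-memory functional of the input
# (a 5-step trailing moving average).
T = 1000
u = rng.uniform(-1.0, 1.0, size=T)
y = np.convolve(u, np.ones(5) / 5, mode="full")[:T]

X = reservoir_states(u)
washout = 50  # discard initial transient states

# Only the last layer (the linear readout) is optimized, via ridge regression.
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ b)

y_hat = X @ W_out
mse = np.mean((y_hat[washout:] - b) ** 2)
```

The approximation error `mse` can then be studied as a function of the number of neurons `n_res` and the hyperparameters (input scale, spectral radius), which is the quantitative relationship the paper establishes bounds for.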