Non-Euclidean Contraction Analysis of Continuous-Time Neural Networks
Document Type:
Article
Authors:
Davydov, Alexander; Proskurnikov, Anton V.; Bullo, Francesco
Affiliations:
University of California System; University of California Santa Barbara; University of California System; University of California Santa Barbara; Polytechnic University of Turin
Journal:
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
ISSN/ISBN:
0018-9286
DOI:
10.1109/TAC.2024.3422217
Publication Date:
2025
Pages:
235-250
Keywords:
artificial neural networks
computational modeling
stability analysis
mathematical models
biological neural networks
asymptotic stability
machine learning
contraction theory
neural networks
stability of nonlinear systems
Abstract:
Critical questions in dynamical neuroscience and machine learning are related to the study of continuous-time neural networks and their stability, robustness, and computational efficiency. These properties can be simultaneously established via a contraction analysis. This article develops a comprehensive non-Euclidean contraction theory for continuous-time neural networks. Specifically, we provide novel sufficient conditions for the contractivity of general classes of continuous-time neural networks including Hopfield, firing rate, Persidskii, Lur'e, and other neural networks with respect to the non-Euclidean ℓ1/ℓ∞ norms. These sufficient conditions are based upon linear programming or, in some special cases, establishing the Hurwitzness of a particular Metzler matrix. To prove these sufficient conditions, we develop novel results on non-Euclidean logarithmic norms and a novel necessary and sufficient condition for contractivity of systems with locally Lipschitz dynamics. For each model, we apply our theoretical results to compute the optimal contraction rate and corresponding weighted non-Euclidean norm with respect to which the neural network is contracting.
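The abstract's two computational ingredients, the ℓ∞ logarithmic norm and Hurwitzness of a Metzler matrix, are standard quantities that can be evaluated directly. The sketch below (not the paper's code; function names and the example matrix are illustrative) computes the weighted ℓ∞ log norm μ∞(PAP⁻¹) for a diagonal weight P = diag(η), whose negativity certifies contraction of ẋ = Ax in the weighted norm ‖x‖ = max_i η_i|x_i|, and checks whether a Metzler matrix is Hurwitz via its spectral abscissa.

```python
import numpy as np

def mu_inf(A, eta=None):
    """l-infinity logarithmic norm of A, optionally weighted.

    With the weighted norm ||x|| = max_i eta_i |x_i| the induced log norm
    is mu_inf(P A P^{-1}), P = diag(eta), where for any matrix M
    mu_inf(M) = max_i ( M_ii + sum_{j != i} |M_ij| ).
    """
    A = np.asarray(A, dtype=float)
    if eta is not None:
        P = np.diag(eta)
        A = P @ A @ np.linalg.inv(P)
    # absolute off-diagonal entries, with the true diagonal kept signed
    off = np.abs(A) - np.diag(np.abs(np.diag(A)))
    return float(np.max(np.diag(A) + off.sum(axis=1)))

def is_metzler_hurwitz(M):
    """True iff M is Metzler (nonnegative off-diagonal) and Hurwitz
    (all eigenvalues have negative real part)."""
    M = np.asarray(M, dtype=float)
    off = M - np.diag(np.diag(M))
    if np.any(off < 0):
        raise ValueError("matrix is not Metzler")
    return bool(np.max(np.linalg.eigvals(M).real) < 0)

# Illustrative example: mu_inf(A) = -1 < 0 certifies that x' = A x
# is contracting with rate 1 in the (unweighted) l-infinity norm.
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
print(mu_inf(A))             # -1.0
print(is_metzler_hurwitz(A)) # True
```

A negative log norm gives the contraction rate directly (rate = −μ∞); searching over the weights η is the linear program alluded to in the abstract.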
Source URL: