A general framework for interpretable neural learning based on local information-theoretic goal functions
Type:
Article
Authors:
Makkeh, Abdullah; Graetz, Marcel; Schneider, Andreas C.; Ehrlich, David A.; Priesemann, Viola; Wibral, Michael
Affiliations:
Fundacao Champalimaud; University of Gottingen; Max Planck Society; Swiss Federal Institutes of Technology Domain; ETH Zurich; University of Gottingen; University of Gottingen
Journal:
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
ISSN:
0027-8424
DOI:
10.1073/pnas.2408125122
Publication date:
2025-03-05
Keywords:
circuit
Abstract:
Despite the impressive performance of biological and artificial networks, an intuitive understanding of how their local learning dynamics contribute to network-level task solutions remains a challenge to this date. Efforts to bring learning to a more local scale have indeed led to valuable insights; however, a general constructive approach to describing local learning goals that is both interpretable and adaptable across diverse tasks is still missing. We have previously formulated a local information processing goal that is highly adaptable and interpretable for a model neuron with compartmental structure. Here, we derive a corresponding parametric local learning rule, which allows us to introduce infomorphic neural networks. We demonstrate the versatility of these networks in performing tasks from supervised, unsupervised, and memory learning. By leveraging this interpretable framework, we aim to advance our understanding of the intricate structure of local learning.