A self-learning magnetic Hopfield neural network with intrinsic gradient descent adaption
Publication type:
Article
Authors:
Niu, Chang; Zhang, Huanyu; Xu, Chuanlong; Hu, Wenjie; Wu, Yunzhuo; Wu, Yu; Wang, Yadi; Wu, Tong; Zhu, Yi; Zhu, Yinyan; Wang, Wenbin; Wu, Yizheng; Yin, Lifeng; Xiao, Jiang; Yu, Weichao; Guo, Hangwen; Shen, Jian
Affiliations:
Fudan University; Hefei National Laboratory; Collaborative Innovation Center of Advanced Microstructures (CICAM)
Journal:
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
ISSN:
0027-8424
DOI:
10.1073/pnas.2416294121
Publication date:
2024-12-09
Keywords:
Abstract:
Physical neural networks (PNNs), which use physical materials and devices to mimic synapses and neurons, offer an energy-efficient way to implement artificial neural networks. Yet training PNNs is difficult and relies heavily on external computing resources. An emerging concept to solve this issue, called physical self-learning, uses intrinsic physical parameters as trainable weights. Under external inputs (i.e., training data), training is achieved by the natural evolution of the physical parameters, which intrinsically adapt modern learning rules through an autonomous physical process, eliminating the need for external computational resources. Here, we demonstrate a real spintronic system that mimics Hopfield neural networks (HNNs) and performs unsupervised learning intrinsically through the evolution of a physical process. Using a magnetic-texture-defined conductance matrix as trainable weights, we show that under external voltage inputs the conductance matrix naturally evolves and adapts Oja's learning algorithm in a gradient descent manner. The self-learning HNN is scalable and can achieve associative memory on patterns with high similarity. The fast spin dynamics and reconfigurability of magnetic textures offer an advantageous platform for efficient autonomous training directly in materials.
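As background only, not the authors' spintronic implementation: the associative-memory behavior described in the abstract can be sketched in software with a toy Hopfield network. The patterns and the plain Hebbian outer-product weights below are illustrative stand-ins; in the paper, the weights are the magnetic-texture-defined conductances and evolve in the material via Oja's learning rule.

```python
import numpy as np

# Two hypothetical +/-1 patterns to memorize (placeholders, not the
# paper's experimental patterns). They are orthogonal, which makes
# the toy recall clean.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]], dtype=float)
n = patterns.shape[1]

# Hebbian outer-product weight matrix standing in for the
# magnetic-texture conductance matrix; diagonal zeroed as is
# conventional for Hopfield networks.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=10):
    """Asynchronous Hopfield updates; a neuron keeps its state
    when its local field is exactly zero."""
    s = state.copy()
    for _ in range(sweeps):
        for i in range(n):
            h = W[i] @ s
            if h != 0:
                s[i] = 1.0 if h > 0 else -1.0
    return s

# Corrupt one bit of the first pattern and let the network clean it up.
probe = patterns[0].copy()
probe[2] *= -1
print(recall(probe))  # recovers the stored pattern
```

The network relaxes the corrupted probe back to the nearest stored pattern, which is the associative-memory property the abstract attributes to the self-learning HNN.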