Backpropagation-free training of deep physical neural networks

Publication type:
Article
Authors:
Momeni, Ali; Rahmani, Babak; Mallejac, Matthieu; del Hougne, Philipp; Fleury, Romain
Affiliations:
Swiss Federal Institutes of Technology Domain; Ecole Polytechnique Federale de Lausanne; Microsoft; Microsoft United Kingdom; Centre National de la Recherche Scientifique (CNRS); CNRS - Institute for Engineering & Systems Sciences (INSIS); Universite de Rennes
Journal:
SCIENCE
ISSN:
0036-8075
DOI:
10.1126/science.adi8474
Publication date:
2023-12-15
Pages:
1297-1303
Keywords:
Abstract:
Recent successes in deep learning for vision and natural language processing are attributed to larger models but come with energy consumption and scalability issues. Current training of digital deep-learning models primarily relies on backpropagation, which is unsuitable for physical implementation. In this work, we propose a simple deep neural network architecture augmented by a physical local learning (PhyLL) algorithm, which enables supervised and unsupervised training of deep physical neural networks without detailed knowledge of the nonlinear physical layer's properties. We trained diverse wave-based physical neural networks in vowel and image classification experiments, showcasing the universality of our approach. Our method offers advantages over other hardware-aware training schemes: it improves training speed, enhances robustness, and reduces power consumption by eliminating the need for system modeling and thereby decreasing digital computation.
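
Note: The following is a minimal, hypothetical sketch of the general idea described in the abstract, namely training stacked black-box physical layers with purely local updates instead of backpropagation. The layer structure, the label-embedding scheme, and the "goodness"-style contrastive objective are illustrative assumptions in the spirit of forward-forward-type local learning, not the authors' exact PhyLL algorithm; the physical system is emulated here by a fixed random nonlinear map.

```python
# Sketch: local, backpropagation-free training with black-box "physical" layers.
# All names (physical_layer, goodness, embed_label, readouts) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
dim, n_classes, n_layers, lr = 16, 4, 3, 0.01

# Fixed random nonlinear maps stand in for the black-box physical layers.
phys_W = [np.random.default_rng(100 + k).normal(size=(dim, dim)) / np.sqrt(dim)
          for k in range(n_layers)]

def physical_layer(x, k):
    """Black box: we may only query it, never differentiate through it."""
    return np.tanh(x @ phys_W[k])

# Trainable digital weights sit *after* each physical measurement, so their
# gradients never require a model of the physics.
readouts = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(n_layers)]

def embed_label(x, label):
    """Overwrite the first n_classes entries of the input with a one-hot label."""
    x = x.copy()
    x[:n_classes] = 0.0
    x[label] = 1.0
    return x

def goodness(z):
    """Local per-layer objective: mean squared activity."""
    return np.mean(z ** 2)

def train_step(x, label):
    wrong = (label + rng.integers(1, n_classes)) % n_classes
    a_pos, a_neg = embed_label(x, label), embed_label(x, wrong)
    for k in range(n_layers):
        # Query the physical layer for the positive and negative passes.
        h_pos, h_neg = physical_layer(a_pos, k), physical_layer(a_neg, k)
        z_pos, z_neg = h_pos @ readouts[k], h_neg @ readouts[k]
        # Local contrastive update: raise goodness on the positive pass and
        # lower it on the negative pass. The gradient w.r.t. the digital
        # readout is analytic because the physics sits upstream of it.
        grad = (np.outer(h_neg, z_neg) - np.outer(h_pos, z_pos)) * (2.0 / dim)
        readouts[k] -= lr * grad
        # Normalized layer outputs feed the next physical layer; no gradient
        # ever flows backward through physical_layer.
        a_pos = z_pos / (np.linalg.norm(z_pos) + 1e-8)
        a_neg = z_neg / (np.linalg.norm(z_neg) + 1e-8)

# Usage example on random data with hypothetical 4-class labels.
for step in range(100):
    x = rng.normal(size=dim)
    train_step(x, label=int(rng.integers(n_classes)))
```

The point of the sketch is structural: each layer's update uses only locally measured outputs of the physical system, so no digital model of the nonlinear physical transformation is needed, which is the property the abstract credits for the reduced digital computation.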