Machine learning without a processor: Emergent learning in a nonlinear analog network
Publication type:
Article
Authors:
Dillavou, Sam; Beyer, Benjamin D.; Stern, Menachem; Liu, Andrea J.; Miskin, Marc Z.; Durian, Douglas J.
Affiliations:
University of Pennsylvania; Simons Foundation; Flatiron Institute; University of Pennsylvania
Journal:
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
ISSN:
0027-8424
DOI:
10.1073/pnas.2319718121
Publication date:
2024-07-09
Keywords:
neural-networks
memristor
Abstract:
Standard deep learning algorithms require differentiating large nonlinear networks, a process that is slow and power-hungry. Electronic contrastive local learning networks (CLLNs) offer potentially fast, efficient, and fault-tolerant hardware for analog machine learning, but existing implementations are linear, severely limiting their capabilities. These systems differ significantly from artificial neural networks as well as the brain, so the feasibility and utility of incorporating nonlinear elements have not been explored. Here, we introduce a nonlinear CLLN: an analog electronic network made of self-adjusting nonlinear resistive elements based on transistors. We demonstrate that the system learns tasks unachievable in linear systems, including XOR (exclusive or) and nonlinear regression, without a computer. We find our decentralized system reduces modes of training error in order (mean, then slope, then curvature), similar to spectral bias in artificial neural networks. The circuitry is robust to damage, retrainable in seconds, and performs learned tasks in microseconds while dissipating only picojoules of energy across each transistor. This suggests enormous potential for fast, low-power computing in edge systems like sensors, robotic controllers, and medical devices, as well as manufacturability at scale for performing and studying emergent learning.
Source URL: