Journal article in Science, 2023

Backpropagation-free Training of Deep Physical Neural Networks

Ali Momeni
Babak Rahmani
Matthieu Malléjac
Romain Fleury

Abstract

Recent years have witnessed the outstanding success of deep learning in fields such as vision and natural language processing. This success is largely owed to the massive size of deep learning models, which is expected to keep growing. The growth of deep learning models is accompanied by concerns about their considerable energy consumption, during both training and inference, as well as about their scalability. Although a number of works based on unconventional physical systems have been proposed to address energy efficiency in the inference phase, efficient training of deep learning models has remained unaddressed. So far, training of digital deep learning models mainly relies on backpropagation, which is not suitable for physical implementation because it requires perfect knowledge of the computation performed in the so-called forward pass of the neural network. Here, we tackle this issue by proposing a simple deep neural network architecture augmented with a biologically plausible learning algorithm, referred to as "model-free forward-forward training". The proposed architecture enables the training of deep physical neural networks consisting of layers of physical nonlinear systems, without requiring detailed knowledge of the nonlinear physical layers' properties. We show that our method outperforms state-of-the-art hardware-aware training methods by improving training speed, decreasing digital computations, and reducing power consumption in physical systems. We demonstrate the adaptability of the proposed method even in systems exposed to dynamic or unpredictable external perturbations. To showcase the universality of our approach, we experimentally train diverse wave-based physical neural networks, which differ in the underlying wave phenomenon and the type of nonlinearity they use, to perform vowel and image classification tasks.
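The abstract describes a layer-local, backpropagation-free training scheme in the spirit of the forward-forward algorithm: each layer is optimized against its own local objective, using only forward evaluations of the (possibly physical) layer. The sketch below is not the authors' implementation; it illustrates the general idea with a toy digital stand-in for a physical layer, a local "goodness" objective, and a zeroth-order (simultaneous-perturbation) update that never differentiates the layer. All function names, shapes, and hyperparameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def black_box_layer(x, theta, internals):
    # Stand-in for a physical nonlinear layer; the trainer never differentiates it.
    # theta is a trainable per-feature input encoding applied before the layer.
    return np.tanh((x * theta) @ internals)

def goodness(h):
    # Per-sample "goodness" score: sum of squared activations.
    return np.sum(h ** 2, axis=1)

def local_loss(theta, x_pos, x_neg, internals, thr=2.0):
    # Layer-local forward-forward objective: positive samples should score above
    # the threshold, negative samples below it. softplus(z) = log(1 + exp(z)).
    g_pos = goodness(black_box_layer(x_pos, theta, internals))
    g_neg = goodness(black_box_layer(x_neg, theta, internals))
    return (np.mean(np.logaddexp(0.0, -(g_pos - thr)))
            + np.mean(np.logaddexp(0.0, g_neg - thr)))

def spsa_step(theta, x_pos, x_neg, internals, lr=0.05, eps=1e-2):
    # Zeroth-order update: two forward evaluations of the local loss per step,
    # so no gradient of the layer itself is ever required.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    l_plus = local_loss(theta + eps * delta, x_pos, x_neg, internals)
    l_minus = local_loss(theta - eps * delta, x_pos, x_neg, internals)
    grad_est = (l_plus - l_minus) / (2.0 * eps) * delta
    return theta - lr * grad_est

# Toy data: positive samples carry most of their energy in the first half of the
# features, negative samples in the second half, so theta must learn which
# features to amplify.
d_in, d_out, n = 20, 32, 256
x_pos = rng.normal(size=(n, d_in)); x_pos[:, : d_in // 2] *= 2.0
x_neg = rng.normal(size=(n, d_in)); x_neg[:, d_in // 2 :] *= 2.0

internals = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)  # un-modeled layer internals
theta = np.ones(d_in)                                        # trainable input encoding

for step in range(500):
    theta = spsa_step(theta, x_pos, x_neg, internals)

print("final local loss:", local_loss(theta, x_pos, x_neg, internals))

In a deep stack, each subsequent layer would be trained with the same local objective on the (typically normalized) outputs of the previous layer, so no error signal ever propagates backward through the layers. The perturbation-based update here is one plausible way to realize a "model-free" step and is an illustrative choice, not necessarily the procedure used in the paper, where the digital stand-in is replaced by wave-based physical systems.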
Main file
2304.11042.pdf (40.3 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04297362, version 1 (21-11-2023)

Identifiers

Cite

Ali Momeni, Babak Rahmani, Matthieu Malléjac, Philipp del Hougne, Romain Fleury. Backpropagation-free Training of Deep Physical Neural Networks. Science, 2023, ⟨10.1126/science.adi8474⟩. ⟨hal-04297362⟩