Can Forward Gradient Match Backpropagation? - Archive ouverte HAL
Conference paper, Year: 2023

Can Forward Gradient Match Backpropagation?

Louis Fournier
Stéphane Rivaud
Eugene Belilovsky
Michael Eickenberg
Edouard Oyallon

Abstract

Forward Gradients - the idea of using directional derivatives in forward differentiation mode - have recently been shown to be utilizable for neural network training while avoiding problems generally associated with backpropagation gradient computation, such as locking and memorization requirements. The cost is the requirement to guess the step direction, which is hard in high dimensions. While current solutions rely on weighted averages over isotropic guess vector distributions, we propose to strongly bias our gradient guesses in directions that are much more promising, such as feedback obtained from small, local auxiliary networks. For a standard computer vision neural network, we conduct a rigorous study systematically covering a variety of combinations of gradient targets and gradient guesses, including those previously presented in the literature. We find that using gradients obtained from a local loss as a candidate direction drastically improves on random noise in Forward Gradient methods.
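As a rough illustration of the forward-gradient idea summarized above, the sketch below uses JAX's forward-mode `jax.jvp` to form the estimate (∇L · v) v from a single guess direction v, without a backward pass. The names (`forward_gradient`, `loss_fn`, `guess`) are illustrative and not taken from the paper's code; in the paper's proposed variant, the guess would come from the gradient of a small local auxiliary loss rather than from isotropic noise, which is shown here only as the baseline.

```python
import jax
import jax.numpy as jnp

def forward_gradient(loss_fn, params, guess):
    # jax.jvp evaluates loss_fn(params) and the directional derivative (∇L · guess)
    # in one forward pass, so no activations need to be stored for backpropagation.
    loss, directional_deriv = jax.jvp(loss_fn, (params,), (guess,))
    # Forward-gradient estimate: (∇L · v) v for the guess direction v.
    grad_estimate = jax.tree_util.tree_map(lambda v: directional_deriv * v, guess)
    return loss, grad_estimate

# Toy usage: quadratic loss with an isotropic random guess (the baseline setting).
def loss_fn(w):
    return jnp.sum(w ** 2)

w = jnp.ones(4)
v = jax.random.normal(jax.random.PRNGKey(0), w.shape)  # random guess direction
loss, g_hat = forward_gradient(loss_fn, w, v)
# With v ~ N(0, I), g_hat is an unbiased (but high-variance) estimate of the true gradient 2*w.
```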
Main file: main.pdf (667.49 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04119829, version 1 (06-06-2023)

Identifiers

Cite

Louis Fournier, Stéphane Rivaud, Eugene Belilovsky, Michael Eickenberg, Edouard Oyallon. Can Forward Gradient Match Backpropagation?. Fortieth International Conference on Machine Learning (ICML), Jul 2023, Honolulu, Hawaii, United States. ⟨hal-04119829⟩
106 Views
122 Downloads
