Phase diagram of Stochastic Gradient Descent in high-dimensional two-layer neural networks
Preprint / working paper, 2022

Abstract

Despite the non-convex optimization landscape, over-parametrized shallow networks are able to achieve global convergence under gradient descent. The picture can be radically different for narrow networks, which tend to get stuck in badly-generalizing local minima. Here we investigate the crossover between these two regimes in the high-dimensional setting, and in particular the connection between the so-called mean-field/hydrodynamic regime and the seminal approach of Saad & Solla. Focusing on the case of Gaussian data, we study the interplay between the learning rate, the time scale, and the number of hidden units in the high-dimensional dynamics of stochastic gradient descent (SGD). Our work builds on a deterministic description of SGD in high dimensions from statistical physics, which we extend and for which we provide rigorous convergence rates.
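
As an illustration of the setting described in the abstract, the following minimal sketch (not the authors' code) runs one-pass, "online" SGD on a two-layer committee machine trained on Gaussian data in a teacher-student setup of the Saad & Solla type. The dimension d, the widths, the learning rate, and the tanh activation are assumptions chosen purely for illustration; the order parameters Q and M computed at the end are the standard summary statistics that deterministic high-dimensional descriptions of this dynamics track.

import numpy as np

# Minimal sketch of the setting above: one-pass ("online") SGD on a
# two-layer committee machine f(x) = (1/k) sum_j g(w_j . x / sqrt(d)),
# trained on fresh Gaussian inputs labelled by a random teacher of the
# same form. All hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)
d = 1000                 # input dimension (the high-dimensional limit is d -> infinity)
k_teacher, k_student = 2, 4
lr = 0.5                 # learning rate; its scaling with d is one axis of the phase diagram
g = np.tanh              # activation, with g'(z) = 1 - tanh(z)**2

W_star = rng.standard_normal((k_teacher, d))  # fixed teacher weights
W = rng.standard_normal((k_student, d))       # student weights, trained below

def committee(W, x):
    # Average of hidden-unit activations, with the 1/sqrt(d) pre-activation scaling.
    return g(W @ x / np.sqrt(d)).mean()

for step in range(50 * d):                    # one fresh Gaussian sample per SGD step
    x = rng.standard_normal(d)
    y = committee(W_star, x)                  # noiseless teacher label
    z = W @ x / np.sqrt(d)                    # student pre-activations
    err = committee(W, x) - y
    # Gradient of the squared loss (1/2)(f(x) - y)^2 w.r.t. each hidden unit's weights.
    grad = err * (1.0 - np.tanh(z) ** 2)[:, None] * x[None, :] / (k_student * np.sqrt(d))
    W -= lr * grad

# The order parameters Q = W W^T / d and M = W W_star^T / d are the summary
# statistics whose (deterministic) evolution such high-dimensional analyses follow.
Q = W @ W.T / d
M = W @ W_star.T / d
print(np.round(M, 3))

With this classical scaling, each SGD step changes the order parameters by O(1/d), so runs of order d steps are needed before M moves appreciably; the interplay between this time scale, the learning rate, and the number of hidden units is precisely what the paper's phase diagram organizes.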
Main file: 2202.00293.pdf (1.85 MB)
Origin: files produced by the author(s)

Dates and versions

hal-04026190, version 1 (13-03-2023)

Cite

Rodrigo Veiga, Ludovic Stephan, Bruno Loureiro, Florent Krzakala, Lenka Zdeborová. Phase diagram of Stochastic Gradient Descent in high-dimensional two-layer neural networks. 2022. ⟨hal-04026190⟩