Conference paper, Year: 2022

On global convergence of ResNets: From finite to infinite width using linear parameterization

Abstract

Overparametrization is a key factor in explaining the global convergence of gradient descent (GD) for neural networks in the absence of convexity. Besides the well-studied lazy regime, infinite-width (mean-field) analyses have been developed for shallow networks, relying on convex optimization techniques. To bridge the gap between the lazy and mean-field regimes, we study Residual Networks (ResNets) in which the residual block has a linear parametrization while still being nonlinear. Such ResNets admit both infinite-depth and infinite-width limits, encoding residual blocks in a Reproducing Kernel Hilbert Space (RKHS). In this limit, we prove a local Polyak-Łojasiewicz inequality; hence every critical point is a global minimizer and a local convergence result for GD holds, recovering the lazy regime. In contrast with other mean-field studies, our analysis applies to both parametric and non-parametric cases under an expressivity condition on the residuals. It also leads to a practical and quantified recipe: starting from a universal RKHS, Random Fourier Features are applied to obtain a finite-dimensional parameterization that satisfies our expressivity condition with high probability.
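To make the recipe concrete, the following is a minimal numerical sketch (not the authors' code) of a ResNet whose residual blocks are linear in their trainable parameters: a fixed Random Fourier Feature map phi approximates a Gaussian RKHS, and each block only trains a matrix W_k acting linearly on phi(x). All names and dimensions (d, m, depth, sigma) and the 1/depth step size are illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: state dimension, number of random features,
# number of residual blocks, Gaussian kernel bandwidth.
d, m, depth, sigma = 16, 256, 32, 1.0

# Fixed (untrained) Random Fourier Features approximating a Gaussian RKHS:
# phi(x) = sqrt(2/m) * cos(Omega x + b), with Omega ~ N(0, sigma^{-2} I), b ~ U[0, 2*pi].
Omega = rng.normal(scale=1.0 / sigma, size=(m, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

def phi(x):
    """Fixed nonlinear feature map R^d -> R^m; only the W_k below are trained."""
    return np.sqrt(2.0 / m) * np.cos(Omega @ x + b)

# Trainable parameters: one d x m matrix per residual block (small random init, assumed).
W = [0.01 * rng.normal(size=(d, m)) for _ in range(depth)]

def forward(x0):
    """Forward pass x_{k+1} = x_k + (1/depth) * W_k phi(x_k): nonlinear in x, linear in W_k."""
    x = x0
    for Wk in W:
        x = x + (1.0 / depth) * Wk @ phi(x)
    return x

# Usage example on a random input.
x_out = forward(rng.normal(size=d))

Because each block is linear in its parameters W_k, training reduces to optimizing over (an approximation of) the RKHS encoding the residuals, which is what makes the finite-width network inherit the expressivity condition with high probability.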
Main file: main.pdf (578.58 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03473699 , version 1 (09-12-2021)
hal-03473699 , version 2 (31-01-2023)

Identifiers

HAL Id: hal-03473699

Cite

Raphaël Barboni, Gabriel Peyré, François-Xavier Vialard. On global convergence of ResNets: From finite to infinite width using linear parameterization. NeurIPS 2022, Nov 2022, New Orleans (online), United States. pp.16385-16397. ⟨hal-03473699v2⟩