Preprint / Working Paper, Year: 2023

Law of large numbers and central limit theorem for wide two-layer neural networks: the mini-batch and noisy case

Arnaud Descours
Arnaud Guillin
Manon Michel
Boris Nectoux

Abstract

In this work, we consider a wide two-layer neural network and study the behavior of its empirical weights under dynamics given by stochastic gradient descent on the quadratic loss with mini-batches and noise. Our goal is to prove a trajectorial law of large numbers (LLN) as well as a central limit theorem (CLT) for their evolution. When the noise scales as 1/N^β, where N is the number of neurons and 1/2 < β ≤ ∞, we rigorously derive and generalize the LLN obtained, for example, in [CRBVE20, MMM19, SS20b]. When 3/4 < β ≤ ∞, we also generalize the CLT (see also [SS20a]) and further exhibit the effect of mini-batching on the asymptotic variance which drives the fluctuations. The case β = 3/4 is trickier, and we give an example showing that the variance diverges in time, thus establishing the instability of the neural network's predictions in this regime. These results are illustrated by simple numerical examples.
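As a rough illustration of the dynamics studied, here is a minimal sketch of noisy mini-batch SGD for a wide two-layer network with noise scaled as N^(-β). The data model, activation, step-size convention, and exact form of the noise are illustrative assumptions, not the paper's precise scheme.

```python
import numpy as np

# Minimal sketch (assumed setup): mean-field parametrization
# f(x) = (1/N) * sum_i c_i * tanh(w_i . x), trained by mini-batch SGD
# on the quadratic loss, with independent Gaussian noise of magnitude
# N**(-beta) added to each weight update.

rng = np.random.default_rng(0)

N = 1000        # network width (number of hidden units)
d = 5           # input dimension
beta = 1.0      # noise exponent: added noise scales as N**(-beta)
batch = 20      # mini-batch size
lr = 0.5        # learning rate; the 1/N factor in the gradient of f
                # is absorbed here (a common mean-field convention)
steps = 2000

# Synthetic regression data (assumed toy model).
X = rng.normal(size=(10_000, d))
y = np.sin(X[:, 0])

W = rng.normal(size=(N, d))   # hidden-layer weights w_i
c = rng.normal(size=N)        # output weights c_i

for _ in range(steps):
    idx = rng.integers(0, len(X), size=batch)
    Xb, yb = X[idx], y[idx]
    h = np.tanh(Xb @ W.T)          # hidden activations, shape (batch, N)
    err = h @ c / N - yb           # prediction error, shape (batch,)
    # Quadratic-loss gradients, averaged over the mini-batch;
    # d tanh(z)/dz = 1 - tanh(z)**2, so the derivative is 1 - h**2.
    grad_c = err @ h / batch                                      # (N,)
    grad_W = ((1.0 - h**2) * (err[:, None] * c)).T @ Xb / batch   # (N, d)
    # SGD step plus Gaussian noise scaled as N**(-beta).
    c -= lr * grad_c + N**(-beta) * rng.normal(size=N)
    W -= lr * grad_W + N**(-beta) * rng.normal(size=(N, d))
```

Heuristically, varying beta in this sketch mirrors the regimes described in the abstract: for large beta the added noise is negligible and the empirical weight distribution concentrates (the LLN regime), while for smaller beta the noise contributes to the fluctuations around the limit.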
Main file
LLN_CLT_arxiv.pdf (713.04 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03737557 , version 1 (25-07-2022)
hal-03737557 , version 2 (28-02-2023)

Identifiers

hal-03737557

Cite

Arnaud Descours, Arnaud Guillin, Manon Michel, Boris Nectoux. Law of large numbers and central limit theorem for wide two-layer neural networks: the mini-batch and noisy case. 2023. ⟨hal-03737557v2⟩