Preprint / working paper. Year: 2021

An Inertial Newton Algorithm for Deep Learning

Abstract

We introduce a new second-order inertial optimization method for machine learning called INNA. It exploits the geometry of the loss function while only requiring stochastic approximations of the function values and the generalized gradients. This makes INNA fully implementable and adapted to large-scale optimization problems such as the training of deep neural networks. The algorithm combines both gradient-descent and Newton-like behaviors as well as inertia. We prove the convergence of INNA for most deep learning problems. To do so, we provide a well-suited framework to analyze deep learning loss functions involving tame optimization, in which we study a continuous dynamical system together with its discrete stochastic approximations. We prove sublinear convergence for the continuous-time differential inclusion which underlies our algorithm. Additionally, we show how standard mini-batch optimization methods applied to non-smooth non-convex problems can yield a certain type of spurious stationary points never discussed before. We address this issue by providing a theoretical framework around the new idea of $D$-criticality; we then give a simple asymptotic analysis of INNA. Our algorithm allows the use of an aggressive learning rate of $o(1/\log k)$. From an empirical viewpoint, we show that INNA returns competitive results with respect to the state of the art (stochastic gradient descent, ADAGRAD, ADAM) on popular deep learning benchmark problems.
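To make the abstract's description concrete, below is a minimal, self-contained sketch of an INNA-style update on a toy least-squares problem. It illustrates the two-variable (Hessian-free) discretization of the underlying inertial Newton dynamics as we understand it from the paper: a parameter variable theta and an auxiliary variable psi, driven only by a stochastic mini-batch gradient estimate. The coefficients alpha and beta, the step-size schedule, the toy data, and the iteration count are illustrative assumptions, not values taken from this page; for the authors' exact algorithm and hyperparameters, see the PDF.

```python
# Illustrative sketch of an INNA-style update (assumptions noted in comments).
# Two coupled variables (theta, psi) yield a Newton-like correction without
# ever forming a Hessian; only a stochastic (mini-batch) gradient is needed.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy linear regression, f(theta) = mean squared error (assumption).
n_samples, dim = 512, 10
X = rng.normal(size=(n_samples, dim))
theta_true = rng.normal(size=dim)
y = X @ theta_true + 0.1 * rng.normal(size=n_samples)

def minibatch_grad(theta, batch_size=32):
    """Stochastic estimate of the gradient from a random mini-batch."""
    idx = rng.choice(n_samples, size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return 2.0 * Xb.T @ (Xb @ theta - yb) / batch_size

# Hyperparameters (illustrative choices): alpha damps the inertia,
# beta weights the Newton-like, geometry-exploiting part of the dynamics.
alpha, beta = 0.5, 0.1
gamma0 = 0.1

theta = np.zeros(dim)
psi = np.zeros(dim)   # auxiliary variable carrying the inertial/Newton term

for k in range(5000):
    # Step size: the abstract allows schedules as aggressive as o(1/log k);
    # here a simple admissible decaying choice is used for illustration.
    gamma = gamma0 / np.sqrt(k + 1.0)

    v = minibatch_grad(theta)                   # stochastic gradient estimate
    drift = (1.0 / beta - alpha) * theta - psi / beta

    theta = theta + gamma * (drift - beta * v)  # parameter update
    psi = psi + gamma * drift                   # auxiliary-variable update

print("distance to ground truth:", np.linalg.norm(theta - theta_true))
```

The design point illustrated here is that the second-order (Newton-like) behavior comes from the coupling between theta and psi rather than from Hessian evaluations, which is what keeps the per-iteration cost comparable to stochastic gradient descent.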
Main file

1905.12278.pdf (1.72 MB)

Origin: Files produced by the author(s)

Dates and versions

hal-02140748 , version 1 (27-05-2019)
hal-02140748 , version 2 (06-06-2019)
hal-02140748 , version 3 (12-12-2019)
hal-02140748 , version 4 (12-10-2020)
hal-02140748 , version 5 (02-07-2021)
hal-02140748 , version 6 (20-08-2021)

Identifiers

hal-02140748

Cite

Camille Castera, Jérôme Bolte, Cédric Févotte, Edouard Pauwels. An Inertial Newton Algorithm for Deep Learning. 2021. ⟨hal-02140748v5⟩
966 Views
417 Downloads
