SignSVRG: fixing signSGD via variance reduction - Archive ouverte HAL
Preprint, Working Paper. Year: 2023

SignSVRG: fixing signSGD via variance reduction

Abstract

We consider the problem of unconstrained minimization of finite sums of functions. We propose a simple yet practical way to incorporate variance reduction techniques into SignSGD, guaranteeing convergence similar to that of full sign gradient descent. The core idea is first instantiated on the problem of minimizing sums of convex and Lipschitz functions and is then extended to the smooth case via variance reduction. Our analysis is elementary and much simpler than the typical proof for variance reduction methods. We show that for smooth functions our method gives an $\mathcal{O}(1 / \sqrt{T})$ rate for the expected norm of the gradient and an $\mathcal{O}(1/T)$ rate in the case of smooth convex functions, recovering convergence results of deterministic methods while preserving the computational advantages of SignSGD.
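The abstract describes combining an SVRG-style variance-reduced gradient estimator with sign-based updates. Below is a minimal sketch of that idea, not the authors' exact algorithm: the function names, epoch length `m`, and step size `eta` are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def sign_svrg(grad_i, x0, n, T, m, eta):
    """Hypothetical sketch: sign steps on an SVRG-style gradient estimate.

    grad_i(x, i) -- gradient of the i-th component function at x
    n            -- number of component functions in the finite sum
    T            -- number of outer epochs
    m            -- number of inner steps per epoch
    eta          -- constant step size
    """
    x = x0.copy()
    for _ in range(T):
        # Snapshot point and full gradient, as in standard SVRG.
        x_snap = x.copy()
        full_grad = np.mean([grad_i(x_snap, i) for i in range(n)], axis=0)
        for _ in range(m):
            i = np.random.randint(n)
            # Variance-reduced gradient estimate at the current iterate.
            v = grad_i(x, i) - grad_i(x_snap, i) + full_grad
            # Sign step, as in signSGD, applied to the variance-reduced estimate.
            x -= eta * np.sign(v)
    return x
```

The point of the combination is that the sign step keeps the cheap, coordinate-wise update of SignSGD, while the variance-reduced estimate controls the stochastic error that otherwise prevents plain SignSGD from converging.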

Dates and versions

hal-04112556 , version 1 (31-05-2023)

Identifiers

Cite

Evgenii Chzhen, Sholom Schechtman. SignSVRG: fixing signSGD via variance reduction. 2023. ⟨hal-04112556⟩