Conference paper. Year: 2020

Random extrapolation for primal-dual coordinate descent

Abstract

We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to the sparsity of the data matrix and to favorable structures of the objective function. With sparse data, our method updates only a subset of the primal and dual variables; with dense data, it uses large step sizes, retaining the benefits of the specific methods designed for each case. In addition to adapting to sparsity, our method attains fast convergence guarantees in favorable cases without any modifications. In particular, we prove linear convergence under metric subregularity, which applies to strongly convex-strongly concave problems and to piecewise linear-quadratic functions. In the general convex-concave case, we show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and the objective values. Numerical evidence demonstrates state-of-the-art empirical performance of our method in both sparse and dense settings, matching and improving upon existing methods.
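To make the family of updates concrete, below is a minimal, self-contained sketch of a coordinate-wise primal-dual (Chambolle-Pock style) iteration on a toy Lasso problem. This is not the authors' algorithm as published: the uniform coordinate sampling, the full-dimensional step-size rule, the extrapolation rule, the problem sizes, and the soft_threshold helper are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Lasso instance: min_x 0.5*||A x - b||^2 + lam*||x||_1, written as the
# saddle point  min_x max_y <A x, y> + lam*||x||_1 - (0.5*||y||^2 + <b, y>).
m, n, lam = 60, 100, 0.1          # hypothetical problem sizes
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Step sizes satisfying the classical PDHG condition tau * sigma * ||A||^2 <= 1
# (used here for simplicity; coordinate methods typically use per-coordinate steps).
L = np.linalg.norm(A, 2)
tau = sigma = 0.9 / L

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
y = np.zeros(m)
x_bar = x.copy()

for k in range(5000):
    # Dual step at the extrapolated primal point:
    # prox of sigma * (0.5*||y||^2 + <b, y>) applied to y + sigma * A x_bar.
    y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)

    # Pick one primal coordinate uniformly at random and update only that coordinate.
    j = rng.integers(n)
    x_old_j = x[j]
    grad_j = A[:, j] @ y
    x[j] = soft_threshold(x[j] - tau * grad_j, tau * lam)

    # Extrapolate the updated coordinate only (theta = 1), since x changed only at j.
    x_bar = x.copy()
    x_bar[j] = 2.0 * x[j] - x_old_j

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
print("objective:", obj)
```

In this simplified sketch the dual update touches the full vector; the point of sparsity-adaptive coordinate methods such as the one described in the abstract is precisely to avoid that by updating only the dual coordinates affected by the sampled primal coordinate.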

Dates and versions

hal-02899036 , version 1 (14-07-2020)

Identifiers

Cite

Ahmet Alacaoglu, Olivier Fercoq, Volkan Cevher. Random extrapolation for primal-dual coordinate descent. International Conference on Machine Learning (ICML), Jul 2020, Online. ⟨hal-02899036⟩
