Conference Papers, Year: 2022

Amortized implicit differentiation for stochastic bilevel optimization

Abstract

We study a class of algorithms for solving bilevel optimization problems in both stochastic and deterministic settings when the inner-level objective is strongly convex. Specifically, we consider algorithms based on inexact implicit differentiation that exploit a warm-start strategy to amortize the estimation of the exact gradient. We then introduce a unified theoretical framework, inspired by the study of singularly perturbed systems (Habets, 1974), to analyze such amortized algorithms. Within this framework, our analysis shows that these algorithms match the computational complexity of oracle methods with access to an unbiased estimate of the gradient, thus improving on many existing results for bilevel optimization. We illustrate these findings with synthetic experiments and demonstrate the efficiency of these algorithms on hyper-parameter optimization problems involving several thousand variables.
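
As a rough illustration of the approach described in the abstract, the sketch below implements one amortized step of inexact implicit differentiation with warm starts. This is a minimal sketch, not the authors' code: the toy objectives f and g, the step size lr, and the budgets inner_steps and lin_steps are illustrative assumptions, and JAX is used only as a convenient way to compute gradients and Hessian-vector products; the paper's actual algorithms and their stochastic variants differ in detail.

import jax
import jax.numpy as jnp

def f(x, y):
    # Outer objective (toy choice, for illustration only).
    return jnp.sum((y - 1.0) ** 2) + 0.1 * jnp.sum(x ** 2)

def g(x, y):
    # Inner objective, strongly convex in y.
    return 0.5 * jnp.sum((y - x) ** 2) + 0.5 * jnp.sum(y ** 2)

grad_g_y = jax.grad(g, argnums=1)  # gradient of g with respect to y

def amortized_step(x, y, v, inner_steps=5, lin_steps=5, lr=0.1):
    # 1) Refine the inner solution with a few gradient steps,
    #    warm-started from the previous outer iteration's y.
    for _ in range(inner_steps):
        y = y - lr * grad_g_y(x, y)
    # 2) Approximately solve H v = grad_y f(x, y), where H is the Hessian
    #    of g in y, warm-starting from the previous v and using only
    #    Hessian-vector products.
    b = jax.grad(f, argnums=1)(x, y)
    hvp = lambda u: jax.grad(lambda yy: jnp.vdot(grad_g_y(x, yy), u))(y)
    for _ in range(lin_steps):
        v = v - lr * (hvp(v) - b)
    # 3) Inexact implicit gradient:
    #    grad_x f(x, y) - grad_x [ grad_y g(x, y) . v ].
    cross = jax.grad(lambda xx: jnp.vdot(grad_g_y(xx, y), v))(x)
    return jax.grad(f, argnums=0)(x, y) - cross, y, v

# Outer loop: y and v persist across iterations, which is the warm start.
x, y, v = jnp.zeros(3), jnp.zeros(3), jnp.zeros(3)
for _ in range(200):
    hypergrad, y, v = amortized_step(x, y, v)
    x = x - 0.1 * hypergrad

Because y and v are carried over between outer iterations, each outer step only spends a few inner updates; this reuse is the amortization the abstract refers to, and the inexact hypergradient becomes increasingly accurate as the iterates stabilize.
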
Main file
main.pdf (1.22 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03455458, version 1 (29-11-2021)

Identifiers

  • HAL Id: hal-03455458, version 1

Cite

Michael Arbel, Julien Mairal. Amortized implicit differentiation for stochastic bilevel optimization. The Tenth International Conference on Learning Representations, Apr 2022, Online, France. ⟨hal-03455458⟩