Solving stochastic weak Minty variational inequalities without increasing batch size - Archive ouverte HAL
Conference paper, 2023

Solving stochastic weak Minty variational inequalities without increasing batch size

Abstract

This paper introduces a family of stochastic extragradient-type algorithms for a class of nonconvex-nonconcave problems characterized by the weak Minty variational inequality (MVI). Unlike existing results on extragradient methods in the monotone setting, employing diminishing stepsizes is no longer possible in the weak MVI setting. This has led to approaches such as increasing the batch size per iteration, which can, however, be prohibitively expensive. In contrast, our proposed method involves two stepsizes and only requires one additional oracle evaluation per iteration. We show that it is possible to keep the first stepsize fixed while only the second is taken to be diminishing, making the scheme interesting even in the monotone setting. Almost sure convergence is established, and we provide a unified analysis for this family of schemes, which contains a nonlinear generalization of the celebrated primal-dual hybrid gradient algorithm.
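
To illustrate the kind of iteration the abstract describes, here is a minimal sketch of a stochastic extragradient-type update with two stepsizes: a fixed extrapolation stepsize and a diminishing update stepsize, at the cost of one extra oracle call per iteration. The toy bilinear operator, the noise model, the stepsize values, and the 1/sqrt(k) schedule are illustrative assumptions, not necessarily the exact scheme analyzed in the paper.

```python
# Sketch of a two-stepsize stochastic extragradient-type iteration
# (illustrative only; parameters and schedule are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def oracle(z, noise=0.1):
    """Noisy evaluation of the operator F(x, y) = (y, -x),
    i.e. the saddle-point operator of the toy problem min_x max_y x*y."""
    x, y = z
    return np.array([y, -x]) + noise * rng.standard_normal(2)

z = np.array([1.0, 1.0])            # initial point (x0, y0)
alpha = 0.5                          # fixed extrapolation stepsize (assumed value)
for k in range(1, 10001):
    beta_k = 0.5 / np.sqrt(k)        # diminishing update stepsize (assumed schedule)
    z_bar = z - alpha * oracle(z)    # extrapolation step: first oracle call
    z = z - beta_k * oracle(z_bar)   # update step: second (additional) oracle call

print(z)  # drifts toward a neighborhood of the solution (0, 0)
```

On this toy monotone problem the iterates approach the saddle point even though only the second stepsize is diminishing, which mirrors the behaviour the abstract highlights for the monotone setting.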

Dates and versions

hal-04075832 , version 1 (20-04-2023)

Identifiers

Cite

Thomas Pethick, Olivier Fercoq, Puya Latafat, Panagiotis Patrinos, Volkan Cevher. Solving stochastic weak Minty variational inequalities without increasing batch size. International Conference on Learning Representations (ICLR), 2023, Kigali, Rwanda. ⟨hal-04075832⟩