Fishr: Invariant Gradient Variances for Out-of-Distribution Generalization - HAL Open Archive
Conference paper - Year: 2022

Fishr: Invariant Gradient Variances for Out-of-Distribution Generalization

Abstract

Learning robust models that generalize well under changes in the data distribution is critical for real-world applications. To this end, there has been growing interest in learning simultaneously from multiple training domains, while enforcing different types of invariance across those domains. Yet, all existing approaches fail to show systematic benefits under controlled evaluation protocols. In this paper, we introduce a new regularization, named Fishr, that enforces domain invariance in the space of the gradients of the loss: specifically, the domain-level variances of gradients are matched across training domains. Our approach is based on the close relations between the gradient covariance, the Fisher Information and the Hessian of the loss: in particular, we show that Fishr eventually aligns the domain-level loss landscapes locally around the final weights. Extensive experiments demonstrate the effectiveness of Fishr for out-of-distribution generalization. Notably, Fishr improves the state of the art on the DomainBed benchmark and performs consistently better than Empirical Risk Minimization. Our code is available at https://github.com/alexrame/fishr.
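The penalty described in the abstract can be illustrated with a minimal, deliberately naive PyTorch sketch: per-sample gradients are obtained here by looping over the batch (the released code instead uses BackPACK for efficiency), only the diagonal of the gradient covariance is used, and function names such as fishr_penalty are illustrative, not taken from the authors' code.

import torch
import torch.nn.functional as F

def per_sample_grad_variance(logits, labels, params):
    # Diagonal variance of per-sample gradients of the loss w.r.t. `params`
    # (a list of tensors through which `logits` was computed).
    grads = []
    for i in range(logits.shape[0]):
        loss_i = F.cross_entropy(logits[i:i + 1], labels[i:i + 1])
        grad_i = torch.autograd.grad(loss_i, params,
                                     retain_graph=True, create_graph=True)
        grads.append(torch.cat([g.flatten() for g in grad_i]))
    grads = torch.stack(grads)                # shape: (n_samples, n_params)
    return grads.var(dim=0, unbiased=False)   # diagonal of the gradient covariance

def fishr_penalty(per_domain_logits, per_domain_labels, params):
    # Match each domain's gradient variance to the mean variance across domains.
    variances = [
        per_sample_grad_variance(logits, labels, params)
        for logits, labels in zip(per_domain_logits, per_domain_labels)
    ]
    mean_variance = torch.stack(variances).mean(dim=0)
    return sum((v - mean_variance).pow(2).sum() for v in variances) / len(variances)

In a training loop, one would minimize the empirical risk plus a coefficient times this penalty; the paper additionally uses a warm-up period, an exponential moving average of the variances, and restricts the penalty to the classifier's gradients, all omitted from this sketch.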
Main file: 2109.02934.pdf (2.44 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03951454, version 1 (23-01-2023)

Identifiers

  • HAL Id: hal-03951454, version 1

Cite

Alexandre Ramé, Corentin Dancette, Matthieu Cord. Fishr: Invariant Gradient Variances for Out-of-Distribution Generalization. ICML, Jul 2022, Baltimore, United States. ⟨hal-03951454⟩