Preprint / Working paper, Year: 2021

Learning PSD-valued functions using kernel sums-of-squares

Abstract

Shape constraints such as positive semi-definiteness (PSD) for matrices or convexity for functions play a central role in many applications in machine learning and the sciences, including metric learning, optimal transport, and economics. Yet, very few function models exist that enforce PSD-ness or convexity with good empirical performance and theoretical guarantees. In this paper, we introduce a kernel sum-of-squares model for functions that take values in the PSD cone, which extends kernel sums-of-squares models that were recently proposed to encode non-negative scalar functions. We provide a representer theorem for this class of PSD functions, show that it constitutes a universal approximator of PSD functions, and derive eigenvalue bounds in the case of subsampled equality constraints. We then apply our results to modeling convex functions by enforcing a kernel sum-of-squares representation of their Hessian, and show that any smooth and strongly convex function may be represented in this way. Finally, we illustrate our methods on a PSD matrix-valued regression task and on scalar-valued convex regression.
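To make the abstract's model concrete, the LaTeX sketch below recalls the scalar kernel sum-of-squares parametrization of a non-negative function and one natural matrix-valued extension of the same form. The block-operator notation is illustrative only and may differ from the exact parametrization used in the paper.

% Scalar kernel sum-of-squares model for a non-negative function:
% given a feature map $\phi : \mathcal{X} \to \mathcal{H}$ and a
% PSD operator $A \succeq 0$ on $\mathcal{H}$,
\[
  f(x) = \langle \phi(x),\, A\,\phi(x) \rangle_{\mathcal{H}} \;\ge\; 0
  \qquad \text{for all } x \in \mathcal{X}.
\]
% Illustrative PSD-matrix-valued extension (assumed form, not
% necessarily the paper's exact construction): take a block operator
% $A = (A_{ij})_{i,j=1}^{d} \succeq 0$ on $\mathcal{H}^{d}$ and set
\[
  F(x)_{ij} = \langle \phi(x),\, A_{ij}\,\phi(x) \rangle_{\mathcal{H}},
  \qquad
  c^{\top} F(x)\, c
    = \langle c \otimes \phi(x),\, A\,(c \otimes \phi(x)) \rangle
    \;\ge\; 0
  \quad \text{for all } c \in \mathbb{R}^{d},
\]
% so $F(x)$ is PSD for every $x$. For convex regression, the same
% construction can be applied to model the Hessian $\nabla^2 f(x)$.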
Main file
Learning_positive_operator_valued_functions (7).pdf (3.4 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03454277, version 1 (29-11-2021)


Cite

Boris Muzellec, Francis Bach, Alessandro Rudi. Learning PSD-valued functions using kernel sums-of-squares. 2021. ⟨hal-03454277⟩