Journal article in SIAM Journal on Mathematics of Data Science, 2022

Sequential construction and dimension reduction of Gaussian processes under constraints

Abstract

Accounting for inequality constraints, such as boundedness, monotonicity, or convexity, is challenging when modeling costly-to-evaluate black-box functions. In this regard, finite-dimensional Gaussian process (GP) regression models bring a valuable solution, as they guarantee that the inequality constraints are satisfied everywhere. Nevertheless, these models are currently restricted to low-dimensional settings (up to dimension 5). Addressing this issue, we introduce the MaxMod algorithm, which sequentially inserts one-dimensional knots or adds active variables, thereby performing dimension reduction and efficient knot allocation at the same time. We prove the convergence of this algorithm. As intermediate steps of the proof, we propose the notion of multi-affine extension and study its properties. We also prove the convergence of finite-dimensional GPs when the knots are not dense in the input space, extending the recent literature. With simulated and real data, we demonstrate that the MaxMod algorithm remains efficient in higher dimensions (at least up to dimension 20) and needs fewer knots than other state-of-the-art constrained GP models to reach a given approximation error.
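To make the greedy structure of MaxMod concrete, here is a minimal Python sketch of the loop described above; it is not the authors' implementation. Two stand-ins are assumed: the constrained MAP predictor of the finite-dimensional GP is replaced by a plain piecewise multilinear interpolant of the black-box function on the current knots, and the MaxMod criterion, which the paper evaluates on the change of the MAP estimator, is approximated by a Monte Carlo estimate of the squared L2 change of the predictor. The names `target`, `predictor`, and `maxmod_step` are illustrative.

```python
# Minimal MaxMod-style greedy loop (illustrative sketch, not the paper's code).
import itertools
import numpy as np
from scipy.interpolate import RegularGridInterpolator

rng = np.random.default_rng(0)
D = 3  # total number of input variables; the third one is inert in `target`

def target(x):
    """Black-box function on [0, 1]^D; x has shape (n, D)."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2  # does not use x[:, 2]

def predictor(knots, x):
    """Piecewise multilinear interpolant of `target` on the tensor grid of
    one-dimensional knots over the active variables; inactive inputs are
    frozen at 0.5. Stand-in for the constrained finite-dimensional GP mode."""
    if not knots:  # no active variable yet: constant predictor
        return np.full(len(x), target(np.full((1, D), 0.5))[0])
    dims = sorted(knots)
    grids = [np.asarray(knots[d]) for d in dims]
    nodes = np.array(list(itertools.product(*grids)))
    full = np.full((len(nodes), D), 0.5)
    full[:, dims] = nodes
    values = target(full).reshape([len(g) for g in grids])
    return RegularGridInterpolator(grids, values)(x[:, dims])

def maxmod_step(knots, x_mc):
    """Score every candidate move (insert a knot in an active variable, or
    activate a new variable with knots at the endpoints) by the Monte Carlo
    squared-L2 change of the predictor, and return the best one."""
    base = predictor(knots, x_mc)
    best = None
    for d in range(D):
        if d in knots:  # candidate knots: midpoints of existing intervals
            ks = knots[d]
            moves = [sorted(ks + [(a + b) / 2]) for a, b in zip(ks, ks[1:])]
        else:           # candidate activation of a new variable
            moves = [[0.0, 1.0]]
        for new_ks in moves:
            cand = {**knots, d: new_ks}
            gain = np.mean((predictor(cand, x_mc) - base) ** 2)
            if best is None or gain > best[0]:
                best = (gain, cand)
    return best

knots = {}
x_mc = rng.random((2000, D))  # Monte Carlo points for the L2 criterion
for step in range(8):
    gain, knots = maxmod_step(knots, x_mc)
    print(f"step {step}: gain={gain:.5f}, "
          f"knots per active variable={ {d: len(k) for d, k in knots.items()} }")
```

On this toy target the third variable is inert, so activating it changes the predictor by exactly zero and the greedy loop never selects it, mimicking the dimension-reduction behavior described in the abstract.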
Main file
LOPEZ-LOPERA_Andres_JMDS_2022.pdf (1.06 MB)
Origin: Publisher files authorized on an open archive

Dates and versions

hal-03614303, version 1 (29-10-2024)

Cite

François Bachoc, Andrés F. López-Lopera, Olivier Roustant. Sequential construction and dimension reduction of Gaussian processes under constraints. SIAM Journal on Mathematics of Data Science, 2022, 4 (2), pp.772-800. ⟨10.1137/21M1407513⟩. ⟨hal-03614303⟩