Preprint / Working paper, 2023

Weighted least-squares approximation with determinantal point processes and generalized volume sampling

Abstract

We consider the problem of approximating a function from $L^2$ by an element of a given $m$-dimensional space, associated with some feature map $\varphi$, using evaluations of the function at random points $x_1, \dots, x_n$. After recalling some results on optimal weighted least-squares using independent and identically distributed points, we consider weighted least-squares using projection determinantal point processes (DPPs) or volume sampling. These distributions introduce dependence between the points that promotes diversity in the selected features $\varphi(x_i)$. We first provide a generalized version of volume-rescaled sampling yielding quasi-optimality results in expectation with a number of samples $n = O(m \log(m))$, meaning that the expected $L^2$ error is bounded by a constant times the best approximation error in $L^2$. Further assuming that the function belongs to some normed vector space $H$ continuously embedded in $L^2$, we prove that the approximation error is almost surely bounded by the best approximation error measured in the $H$-norm. This includes the cases of functions from $L^\infty$ or from reproducing kernel Hilbert spaces. Finally, we present an alternative strategy based on independent repetitions of projection DPP (or volume) sampling, yielding error bounds similar to those obtained with i.i.d. or volume sampling, but in practice requiring a much smaller number of samples. Numerical experiments illustrate the performance of the different strategies.
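To make the setting concrete, here is a minimal sketch (not the authors' code) of the i.i.d. optimal weighted least-squares baseline recalled at the start of the abstract: points are drawn from the density proportional to the inverse Christoffel function $k_m(x) = \sum_j \varphi_j(x)^2$, and each evaluation is reweighted by $m / k_m(x_i)$ before solving the least-squares problem. The Legendre feature map, the test function, and the sample sizes below are illustrative assumptions; the paper's DPP and volume-sampling strategies replace the independent draw with a joint distribution over the $n$ points.

```python
# Sketch of optimal weighted least-squares with i.i.d. points (baseline recalled
# in the abstract). Feature map: orthonormal Legendre polynomials on [-1, 1]
# w.r.t. the uniform probability measure. Illustrative only.
import numpy as np
from numpy.polynomial import legendre


def features(x, m):
    """phi_j(x) = sqrt(2j+1) * P_j(x), orthonormal in L^2(dx/2) on [-1, 1]."""
    V = legendre.legvander(x, m - 1)              # columns: P_0, ..., P_{m-1}
    return V * np.sqrt(2 * np.arange(m) + 1)


def sample_optimal_density(n, m, rng):
    """Rejection sampling from the density k_m(x)/m w.r.t. uniform on [-1, 1],
    where k_m(x) = sum_j phi_j(x)^2 <= m^2 (the bound is attained at x = +-1)."""
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0, size=4 * n)
        k = np.sum(features(x, m) ** 2, axis=1)
        accept = rng.uniform(size=x.size) < k / m**2   # envelope constant M = m
        out.extend(x[accept])
    return np.array(out[:n])


def weighted_least_squares(f, n, m, rng):
    """Fit coefficients c of sum_j c_j phi_j by weighted least squares,
    with weights w_i = m / k_m(x_i) compensating the sampling density."""
    x = sample_optimal_density(n, m, rng)
    Phi = features(x, m)
    w = m / np.sum(Phi**2, axis=1)
    sw = np.sqrt(w)[:, None]
    c, *_ = np.linalg.lstsq(sw * Phi, sw[:, 0] * f(x), rcond=None)
    return c, x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: np.exp(x) * np.sin(5 * x)        # assumed test function
    m, n = 10, 60                                  # n of the order of m log m
    c, x = weighted_least_squares(f, n, m, rng)
    xg = np.linspace(-1, 1, 2000)
    err = np.sqrt(np.mean((f(xg) - features(xg, m) @ c) ** 2))
    print(f"empirical L2 error with {n} samples: {err:.2e}")
```

In this sketch the reweighting makes the weighted empirical Gram matrix an unbiased estimate of the identity, which is what drives the quasi-optimality bounds; the DPP and volume-sampling schemes studied in the paper aim at the same effect with dependent, diversity-promoting points.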

Dates and versions

hal-04385090, version 1 (10-01-2024)

Identifiers

Cite

Anthony Nouy, Bertrand Michel. Weighted least-squares approximation with determinantal point processes and generalized volume sampling. 2023. ⟨hal-04385090⟩