Preprint / Working Paper. Year: 2024

Importance sampling-based gradient method for dimension reduction in Poisson log-normal model

Abstract

High-dimensional count data poses significant challenges for statistical analysis, necessitating effective methods that also preserve explainability. We focus on a low-rank constrained variant of the Poisson log-normal model, which relates the observed data to a latent low-dimensional multivariate Gaussian variable via a Poisson distribution. Variational inference methods have become the gold-standard solution for inferring such a model. While computationally efficient, they usually lack theoretical statistical properties with respect to the model. To address this issue, we propose a projected stochastic gradient scheme that directly maximizes the log-likelihood. We prove the convergence of the proposed method when importance sampling is used to estimate the gradient. Specifically, we obtain a rate of convergence of $O(T^{-1/2} + N^{-1})$, with $T$ the number of iterations and $N$ the number of Monte Carlo draws. The latter follows from a novel descent lemma for nonconvex $L$-smooth objective functions and random biased gradient estimates. We also demonstrate numerically the efficiency of our solution compared to its variational competitor. Our method not only scales with the number of observed samples but also provides access to the desirable properties of the maximum likelihood estimator.
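
For intuition, the following is a minimal Python sketch (not the authors' implementation) of a projected stochastic-gradient step of the kind the abstract describes: the gradient of the log-likelihood is estimated by self-normalized importance sampling via Fisher's identity, and the update is projected back onto a constraint set. The simplified model (counts $Y \sim \text{Poisson}(\exp(Cw))$ with a standard Gaussian latent $w$), the standard-normal proposal, the Frobenius-ball projection, the step size, and all names are assumptions made for illustration; offsets and covariates are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_gradient(y, c_mat, n_draws=200):
    """Self-normalized importance-sampling estimate of grad_C log p(y).

    Fisher's identity gives grad_C log p(y) = E_{w | y}[ grad_C log p(y, w) ];
    the posterior expectation is approximated with draws from the N(0, I_q) prior.
    """
    ws = rng.standard_normal((n_draws, c_mat.shape[1]))  # proposal draws
    zs = ws @ c_mat.T                                    # (n_draws, p) log-intensities C w
    # importance weights: p(y, w) / proposal = Poisson likelihood p(y | w), up to a constant
    log_w = np.sum(y * zs - np.exp(zs), axis=1)
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()
    # grad_C log p(y, w) = (y - exp(C w)) w^T, averaged under the normalized weights
    return (weights[:, None] * (y - np.exp(zs))).T @ ws

def project(c_mat, radius=10.0):
    """Stand-in projection onto a Frobenius-norm ball (hypothetical constraint set)."""
    norm = np.linalg.norm(c_mat)
    return c_mat if norm <= radius else c_mat * (radius / norm)

# Toy data: n samples of p counts, latent dimension q (low-rank loading C in R^{p x q})
n, p, q = 50, 10, 2
C_true = 0.5 * rng.standard_normal((p, q))
Y = rng.poisson(np.exp(rng.standard_normal((n, q)) @ C_true.T))

C = 0.1 * rng.standard_normal((p, q))
step = 1e-3
for t in range(200):                     # projected stochastic gradient ascent
    i = rng.integers(n)                  # pick one observation at random
    C = project(C + step * is_gradient(Y[i], C))
```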
Main file
main.pdf (837.99 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04712554, version 1 (30-09-2024)

Identifiers

  • HAL Id: hal-04712554, version 1

Cite

Bastien Batardière, Julien Chiquet, Joon Kwon, Julien Stoehr. Importance sampling-based gradient method for dimension reduction in Poisson log-normal model. 2024. ⟨hal-04712554⟩