Conference paper. Year: 2023

Explicit Diffusion of Gaussian Mixture Model Based Image Priors

Abstract

In this work we tackle the problem of estimating the density $f_X$ of a random variable $X$ by successive smoothing, such that the smoothed random variable $Y$ fulfills $(\partial_t - \varDelta_1) f_Y(\,\cdot\,, t) = 0$, $f_Y(\,\cdot\,, 0) = f_X$. With a focus on image processing, we propose a product/fields-of-experts model with Gaussian mixture experts that admits an analytic expression for $f_Y(\,\cdot\,, t)$ under an orthogonality constraint on the filters. This construction naturally allows the model to be trained simultaneously over the entire diffusion horizon using empirical Bayes. We show preliminary results on image denoising where our model leads to competitive results while being tractable, interpretable, and having only a small number of learnable parameters. As a byproduct, our model can be used for reliable noise estimation, allowing blind denoising of images corrupted by heteroscedastic noise.
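
The analytic expression for $f_Y(\,\cdot\,, t)$ mentioned above rests on the fact that smoothing under the heat equation amounts to convolution with a Gaussian of variance $2t$, so a Gaussian mixture stays a Gaussian mixture with each component variance inflated by $2t$. The sketch below illustrates this in one dimension only; the mixture parameters and function names are illustrative and are not taken from the paper, which works with filter responses of a fields-of-experts model under an orthogonality constraint.

```python
import numpy as np

def gmm_density(x, weights, means, variances):
    """Density of a 1-D Gaussian mixture evaluated at the points x."""
    x = np.asarray(x)[:, None]
    comps = np.exp(-(x - means) ** 2 / (2 * variances)) / np.sqrt(2 * np.pi * variances)
    return comps @ weights

def diffused_gmm_density(x, weights, means, variances, t):
    """Closed-form solution of (d/dt - d^2/dx^2) f_Y = 0 with f_Y(., 0) = f_X:
    each mixture component keeps its mean and weight, its variance grows by 2*t."""
    return gmm_density(x, weights, means, variances + 2.0 * t)

if __name__ == "__main__":
    # Illustrative mixture (parameters made up for this demo).
    weights = np.array([0.3, 0.7])
    means = np.array([-1.0, 2.0])
    variances = np.array([0.2, 0.5])

    # Numerical check: convolving f_X with the heat kernel G_{2t}
    # reproduces the closed-form diffused density.
    t = 0.4
    xs = np.linspace(-10.0, 10.0, 4001)
    dx = xs[1] - xs[0]
    f_x = gmm_density(xs, weights, means, variances)
    kernel = np.exp(-xs ** 2 / (4 * t)) / np.sqrt(4 * np.pi * t)
    f_y_numeric = np.convolve(f_x, kernel, mode="same") * dx
    f_y_closed = diffused_gmm_density(xs, weights, means, variances, t)
    print(np.max(np.abs(f_y_numeric - f_y_closed)))  # should be close to zero
```

In the paper's setting the same variance-inflation argument is applied per expert, which is what makes training over the entire diffusion horizon tractable; the snippet above only verifies the underlying one-dimensional identity numerically.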

Dates and versions

hal-04239963, version 1 (12-10-2023)

Identifiers

Cite

Martin Zach, Thomas Pock, Erich Kobler, Antonin Chambolle. Explicit Diffusion of Gaussian Mixture Model Based Image Priors. 9th International Conference on Scale Space and Variational Methods in Computer Vision, SSVM 2023, Università di Bologna, May 2023, Santa Margherita di Pula (Cagliari), Italy. pp.3-15, ⟨10.1007/978-3-031-31975-4_1⟩. ⟨hal-04239963⟩