Preprint / Working paper, Year: 2021

On Maximum-a-Posteriori estimation with Plug & Play priors and stochastic gradient descent

Abstract

Bayesian methods to solve imaging inverse problems usually combine an explicit data likelihood function with a prior distribution that explicitly models expected properties of the solution. Many kinds of priors have been explored in the literature, from simple ones expressing local properties to more involved ones exploiting image redundancy at a non-local scale. In a departure from explicit modelling, several recent works have proposed and studied the use of implicit priors defined by an image denoising algorithm. This approach, commonly known as Plug & Play (PnP) regularisation, can deliver remarkably accurate results, particularly when combined with state-of-the-art denoisers based on convolutional neural networks. However, the theoretical analysis of PnP Bayesian models and algorithms is difficult, and works on the topic often rely on unrealistic assumptions about the properties of the image denoiser. This paper studies maximum-a-posteriori (MAP) estimation for Bayesian models with PnP priors. We first consider questions related to existence, stability and well-posedness, and then present a convergence proof for MAP computation by PnP stochastic gradient descent (PnP-SGD) under realistic assumptions on the denoiser used. We report a range of imaging experiments demonstrating PnP-SGD, as well as comparisons with other PnP schemes.
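To make the scheme described in the abstract concrete, the sketch below shows a generic PnP (stochastic) gradient descent loop in Python. It assumes a quadratic data fidelity f(x) = 0.5‖Ax - y‖² and treats the denoiser as a black box whose residual (x - D(x))/eps stands in for the gradient of the implicit prior; the function name pnp_sgd and the parameters step, lam, eps and batch_grad are illustrative choices, not the paper's notation or implementation.

```python
import numpy as np

def pnp_sgd(y, A, denoiser, n_iter=500, step=1e-4, lam=1.0, eps=1e-2,
            batch_grad=None, rng=None):
    """Sketch of a Plug & Play (stochastic) gradient descent loop.

    Targets the MAP objective f(x) + lam * g_eps(x), where
    f(x) = 0.5 * ||A x - y||^2 is a Gaussian data fidelity and the
    gradient of the implicit prior g_eps is approximated by the
    denoiser residual (x - denoiser(x)) / eps.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = A.T @ y  # crude initialisation from the observation
    for _ in range(n_iter):
        if batch_grad is None:
            grad_f = A.T @ (A @ x - y)            # full gradient of the data fidelity
        else:
            grad_f = batch_grad(x, rng)           # stochastic estimate of the same gradient
        grad_g = (x - denoiser(x)) / eps          # PnP surrogate for the prior gradient
        x = x - step * (grad_f + lam * grad_g)    # (stochastic) gradient descent step
    return x
```

Passing a batch_grad callback that estimates the likelihood gradient on a random subset of the data gives the stochastic variant; with batch_grad=None the loop reduces to deterministic PnP gradient descent.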
Main file: PNSGD.pdf (3.29 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03348735 , version 1 (19-09-2021)
hal-03348735 , version 2 (21-07-2022)
hal-03348735 , version 3 (15-12-2022)

Identifiers

hal-03348735

Cite

Rémi Laumont, Valentin de Bortoli, Andrés Almansa, Julie Delon, Alain Durmus, et al. On Maximum-a-Posteriori estimation with Plug & Play priors and stochastic gradient descent. 2021. ⟨hal-03348735v1⟩
482 views
443 downloads
