Conference Paper, 2023

Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition

Abstract

Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form $\pi(x) \propto \exp(-V(x))$. In the existing theory of Langevin-type algorithms and SVGD, the potential function $V$ is often assumed to be $L$-smooth. However, this restrictive condition excludes a large class of potential functions, such as polynomials of degree greater than $2$. Our paper studies the convergence of the SVGD algorithm for distributions with $(L_0, L_1)$-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. [2019a] for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the $\mathrm{KL}$ divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
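For concreteness, the relaxed assumption of Zhang et al. [2019a] asks that $\|\nabla^2 V(x)\| \le L_0 + L_1 \|\nabla V(x)\|$ hold for all $x$; unlike $L$-smoothness, this admits potentials such as $V(x) = \|x\|^4/4$. Below is a minimal NumPy sketch of the standard SVGD particle update applied to that quartic potential. The RBF kernel, the fixed bandwidth, the step size, and the names svgd_step and grad_log_pi are illustrative assumptions, not the paper's exact setting.

import numpy as np

def svgd_step(X, grad_log_pi, step=1e-2, h=1.0):
    """One SVGD update on an (n, d) array of particles X.

    grad_log_pi maps (n, d) particles to their (n, d) scores,
    i.e. grad log pi(x) = -grad V(x) evaluated at each particle.
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]                 # diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))   # symmetric RBF kernel matrix
    scores = grad_log_pi(X)
    # Driving term: (1/n) * sum_j k(x_j, x_i) * score(x_j).
    drift = K @ scores / n
    # Repulsive term: (1/n) * sum_j grad_{x_j} k(x_j, x_i),
    # which for the RBF kernel equals (1/n) * sum_j (x_i - x_j) / h^2 * k(x_j, x_i).
    repulsion = np.einsum('ij,ijd->id', K, diffs) / (n * h**2)
    return X + step * (drift + repulsion)

# Sample from pi(x) proportional to exp(-V(x)) with V(x) = x^4 / 4, a degree-4
# polynomial potential that is (L0, L1)-smooth but not L-smooth.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
for _ in range(500):
    X = svgd_step(X, lambda P: -P**3)                     # score = -grad V = -x^3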

File not deposited

Dates and versions

hal-04901622, version 1 (20-01-2025)

Identifiers

Cite

Lukang Sun, Avetik Karagulyan, Peter Richtarik. Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition. The 26th International Conference on Artificial Intelligence and Statistics, 2023, Valencia, Spain. ⟨10.48550/arXiv.2206.00508⟩. ⟨hal-04901622⟩