Convergence of dynamical stationary fluctuations
Abstract
We present a general black-box theorem that ensures the convergence of a sequence of stationary Markov processes, provided a few assumptions are satisfied. The theorem relies on controlling the resolvents of the approximating Markov processes and on a suitable characterization of the resolvent of the limit. One major advantage of this approach is that it circumvents the use of the Boltzmann-Gibbs principle: in particular, we deduce in a rather simple way that the stationary fluctuations of the one-dimensional zero-range process converge to the stochastic heat equation. It also allows us to establish results that were probably out of reach of existing methods: using the black-box result, we prove that the stationary fluctuations of a discrete model of ordered interfaces, previously considered in the statistical physics literature, converge to a system of reflected stochastic PDEs.
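For orientation, here is a minimal sketch of the objects the abstract refers to, with notation that is not taken from the paper: the resolvent of the n-th Markov process, whose control underlies the black-box criterion, and a generic form of the stochastic heat equation arising as the limit of the stationary fluctuation field.

% Illustrative notation only (assumptions of this sketch, not the paper's):
% (P^n_t) denotes the semigroup of the n-th stationary Markov process,
% and R^\lambda_n its resolvent.
\[
  R^\lambda_n f \;=\; \int_0^\infty e^{-\lambda t}\, P^n_t f \,\mathrm{d}t,
  \qquad \lambda > 0 .
\]
% The criterion controls R^\lambda_n and identifies its limit with the
% resolvent of the limiting dynamics, exemplified here by a stochastic heat
% equation for the fluctuation field Y, driven by a space-time white noise W
% (the coefficients D and sigma depend on the model and are placeholders):
\[
  \partial_t Y \;=\; D\, \partial_x^2 Y \;+\; \sqrt{2\sigma}\; \partial_x \dot{W} .
\]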
Domains
Mathematics [math]

Origin: Files produced by the author(s)