A functional limit theorem for coin tossing Markov chains
Abstract
We prove a functional limit theorem for Markov chains that, in each step, move up or down by a possibly state-dependent constant, each with probability 1/2. The theorem entails that the law of every one-dimensional regular continuous strong Markov process can be approximated arbitrarily well by such Markov chains. It applies, in particular, to Markov processes that cannot be characterized as solutions to stochastic differential equations. We illustrate the theorem with Markov processes exhibiting sticky features, e.g., sticky Brownian motion and a Brownian motion slowed down on the Cantor set.
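For intuition only, the following minimal Python sketch simulates a chain of the kind described above: at each step it moves up or down by a state-dependent amount with probability 1/2 each. The function name `simulate_coin_tossing_chain`, the constant step size `eps`, and the Donsker-type time scaling used in the usage example are illustrative assumptions; they are not the state-dependent construction or scaling analysed in the paper.

```python
import numpy as np

def simulate_coin_tossing_chain(x0, step_size, n_steps, rng=None):
    """Illustrative sketch: from state x, move to x + step_size(x) or
    x - step_size(x), each with probability 1/2 (a fair coin toss)."""
    rng = np.random.default_rng() if rng is None else rng
    path = np.empty(n_steps + 1)
    path[0] = x0
    for k in range(n_steps):
        direction = 1 if rng.random() < 0.5 else -1  # fair coin toss
        path[k + 1] = path[k] + direction * step_size(path[k])
    return path

# With a constant step size eps and time step eps**2, the rescaled chain
# approximates standard Brownian motion (classical Donsker-type scaling);
# state-dependent choices of step_size give other limit processes.
eps = 0.01
path = simulate_coin_tossing_chain(
    x0=0.0, step_size=lambda x: eps, n_steps=int(1 / eps**2)
)
```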