A Sharp Discrepancy Bound for Jittered Sampling
Abstract
For $m, d \in \N$, a jittered sampling point set $P$ consisting of $N = m^d$ points in $[0,1)^d$ is constructed by partitioning the unit cube $[0,1)^d$ into $m^d$ axis-aligned cubes of equal size and then placing one point independently and uniformly at random in each cube. We show that there are constants $c, C > 0$ such that for all $d$ and all $m \ge d$ the expected non-normalized star discrepancy of a jittered sampling point set satisfies \[c \,dm^{\frac{d-1}{2}} \sqrt{1 + \log(\tfrac{m}{d})} \le \E D^*(P) \le C\, dm^{\frac{d-1}{2}} \sqrt{1 + \log(\tfrac{m}{d})}.\]
This discrepancy is thus smaller by a factor of $\Theta\big(\sqrt{\frac{1+\log(m/d)}{m/d}}\,\big)$ than that of a uniformly distributed random point set of $m^d$ points. This result improves both the upper and the lower bound for the discrepancy of jittered sampling given by Pausinger and Steinerberger (Journal of Complexity, 2016). It also removes the asymptotic requirement that $m$ be sufficiently large compared to $d$.
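For illustration, here is a minimal sketch (not taken from the paper) of the jittered sampling construction described above, assuming NumPy is available; the function name `jittered_sampling` is hypothetical.

```python
import numpy as np

def jittered_sampling(m: int, d: int, rng=None) -> np.ndarray:
    """Return N = m**d jittered sampling points in [0,1)^d.

    The unit cube is partitioned into m^d axis-aligned subcubes of side 1/m,
    and one point is drawn uniformly and independently inside each subcube.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Lower-left corners of all m^d subcubes, one row per subcube.
    grids = np.meshgrid(*[np.arange(m) for _ in range(d)], indexing="ij")
    corners = np.stack([g.ravel() for g in grids], axis=1) / m
    # One independent uniform offset of size at most 1/m per coordinate,
    # so each point stays inside its own subcube.
    offsets = rng.random(corners.shape) / m
    return corners + offsets

# Example: m = 4, d = 2 gives N = 16 points, one in each of the 16 cells.
points = jittered_sampling(4, 2)
print(points.shape)  # (16, 2)
```

Each of the $m^d$ subcubes of side length $1/m$ contributes exactly one point, drawn uniformly within it and independently of all other points.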
Domains
Computer Science [cs]