NS Pillai, AM Stuart and AH Thiery
Optimal Scaling and Diffusion Limits for the Langevin Algorithm in High Dimensions
Abstract: The Metropolis-adjusted Langevin algorithm (MALA) is a sampling algorithm that makes local moves by incorporating information about the gradient of the target density. In this paper we study the efficiency of MALA on a natural class of target measures supported on an infinite-dimensional Hilbert space. These natural measures have density with respect to a Gaussian random field measure and arise in many applications, such as Bayesian nonparametric statistics and the theory of conditioned diffusions. We prove that, started in stationarity, a suitably interpolated and scaled version of the Markov chain corresponding to MALA converges to an infinite-dimensional diffusion process. Our results imply that, in stationarity, the MALA algorithm applied to an N-dimensional approximation of the target will take O(N^{1/3}) steps to explore the invariant measure. As a by-product of the diffusion limit, it also follows that the MALA algorithm is optimized at an average acceptance probability of 0.574. Until now such results were proved only for targets which are products of one-dimensional distributions, or for variants of this situation. Our result is the first derivation of scaling limits for the MALA algorithm applied to target measures which are not of product form. As a consequence, the rescaled MALA algorithm converges weakly to an infinite-dimensional Hilbert-space-valued diffusion, and not to a scalar diffusion. The diffusion limit is proved by showing that a drift-martingale decomposition of the Markov chain, suitably scaled, closely resembles an Euler-Maruyama discretization of the putative limit. An invariance principle is proved for the martingale, and a continuous mapping argument is used to complete the proof.
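The abstract's opening describes MALA as a gradient-informed local-move sampler with a Metropolis accept/reject correction. The following is a minimal sketch of the standard MALA transition on a finite-dimensional target, not the paper's infinite-dimensional construction; the function names (`mala`, `log_pi`, `grad_log_pi`) and the example step size are illustrative choices, not taken from the paper. In the paper's setting, keeping the average acceptance probability near 0.574 corresponds to a step size scaling like N^{-1/3} in the dimension N of the approximation.

```python
import numpy as np

def mala(log_pi, grad_log_pi, x0, step, n_steps, seed=None):
    """Metropolis-adjusted Langevin algorithm (standard finite-dimensional form).

    Proposal: y = x + (step/2) * grad_log_pi(x) + sqrt(step) * N(0, I),
    i.e. one Euler-Maruyama step of the Langevin diffusion, followed by a
    Metropolis-Hastings accept/reject so the chain targets pi exactly.
    Returns the sample path and the empirical acceptance rate.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)

    def log_q(b, a):
        # log density (up to a constant) of the Gaussian proposal b | a
        mean = a + 0.5 * step * grad_log_pi(a)
        return -np.sum((b - mean) ** 2) / (2.0 * step)

    samples, accepts = [], 0
    for _ in range(n_steps):
        y = x + 0.5 * step * grad_log_pi(x) + np.sqrt(step) * rng.standard_normal(x.shape)
        # Metropolis-Hastings log acceptance ratio with proposal correction
        log_alpha = log_pi(y) - log_pi(x) + log_q(x, y) - log_q(y, x)
        if np.log(rng.uniform()) < log_alpha:
            x = y
            accepts += 1
        samples.append(x.copy())
    return np.array(samples), accepts / n_steps
```

For example, sampling a standard Gaussian in N dimensions, one would tune `step` (heuristically, proportionally to N^{-1/3} per the scaling result above) until the reported acceptance rate is close to 0.574.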