Enhancing Sampling in Computational Statistics Using Modified Hamiltonians
The Hamiltonian Monte Carlo (HMC) method is recognized as a powerful sampling tool in computational statistics. In this thesis, we show that the performance of HMC can be dramatically improved by replacing the Hamiltonians in the Metropolis test with modified Hamiltonians, and the complete momentum update with a partial momentum refreshment. The resulting generalized HMC importance sampler, which we call Mix & Match Hamiltonian Monte Carlo (MMHMC), arose as an extension of the Generalized Shadow Hybrid Monte Carlo (GSHMC) method, previously proposed for molecular simulation. MMHMC adapts GSHMC to computational statistics and enriches it with essential new features: (i) efficient algorithms for computing modified Hamiltonians; (ii) an implicit momentum update procedure; and (iii) two-stage splitting integration schemes derived specifically for methods that sample with modified Hamiltonians. In addition, optional strategies for momentum update and momentum flipping are introduced, and algorithms for adaptive parameter tuning and for efficient sampling of multimodal distributions are developed. MMHMC has been implemented in the in-house software package HaiCS (Hamiltonians in Computational Statistics), written in C, tested on popular statistical models, and compared in sampling efficiency with HMC, Generalized Hybrid Monte Carlo, Riemann Manifold Hamiltonian Monte Carlo, the Metropolis Adjusted Langevin Algorithm and Random Walk Metropolis-Hastings. Analysis of time-normalized effective sample size reveals the superiority of MMHMC over these popular sampling techniques, especially for high-dimensional problems.
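The three ingredients named above — a partial momentum refreshment, a Metropolis test on a modified (shadow) Hamiltonian, and importance reweighting back to the true target — can be sketched for a one-dimensional Gaussian target. This is a minimal illustration, not the thesis's method: the fourth-order shadow-Hamiltonian coefficients below are one commonly quoted form for the leapfrog integrator (specialized to a quadratic potential), and the function name `mmhmc_sketch` and all parameter values (`eps`, `L`, `phi`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: standard 1-D Gaussian, so U(x) = x^2/2 and mass M = 1.
def U(x):      return 0.5 * x * x
def grad_U(x): return x

def H(x, p):
    # True Hamiltonian: potential plus kinetic energy.
    return U(x) + 0.5 * p * p

def H_tilde(x, p, eps):
    # ASSUMED 4th-order shadow Hamiltonian of leapfrog for this
    # quadratic U (U'' = 1); the exact expressions used by MMHMC are
    # derived in the thesis.  The sampler below is a valid MCMC method
    # for exp(-H_tilde) regardless of the precise coefficients.
    return H(x, p) + eps**2 * (p * p / 12.0 - x * x / 24.0)

def leapfrog(x, p, eps, L):
    # Velocity Verlet / leapfrog (kick-drift-kick), L steps of size eps.
    p = p - 0.5 * eps * grad_U(x)
    for i in range(L):
        x = x + eps * p
        if i < L - 1:
            p = p - eps * grad_U(x)
    p = p - 0.5 * eps * grad_U(x)
    return x, p

def mmhmc_sketch(n_iter=5000, eps=0.5, L=10, phi=0.5):
    x, p = 0.0, rng.standard_normal()
    xs, ws = [], []
    for _ in range(n_iter):
        # 1) Partial momentum refreshment: mix p with fresh noise u via
        #    an involutive reflection, accepted by a Metropolis test on
        #    the modified Hamiltonian (in the spirit of GSHMC).
        u = rng.standard_normal()
        p_new = np.cos(phi) * p + np.sin(phi) * u
        u_new = np.sin(phi) * p - np.cos(phi) * u   # involution: applying twice restores (p, u)
        dE = (H_tilde(x, p_new, eps) + 0.5 * u_new**2) \
           - (H_tilde(x, p, eps) + 0.5 * u**2)
        if rng.random() < np.exp(-dE):
            p = p_new
        # 2) Hamiltonian dynamics proposal, Metropolis test on H_tilde
        #    (not on H), momentum flip on rejection as in generalized HMC.
        x_prop, p_prop = leapfrog(x, p, eps, L)
        dH = H_tilde(x_prop, p_prop, eps) - H_tilde(x, p, eps)
        if rng.random() < np.exp(-dH):
            x, p = x_prop, p_prop
        else:
            p = -p
        # 3) Importance weight correcting from exp(-H_tilde) to exp(-H).
        xs.append(x)
        ws.append(np.exp(H_tilde(x, p, eps) - H(x, p)))
    xs, ws = np.array(xs), np.array(ws)
    mean = np.sum(ws * xs) / np.sum(ws)
    second_moment = np.sum(ws * xs**2) / np.sum(ws)
    return mean, second_moment
```

Because the chain targets the shadow density exp(-H_tilde), which the integrator conserves far more accurately than H, acceptance rates stay high even for large step sizes; the weighted averages in step 3 then recover expectations under the true target (here, mean approximately 0 and second moment approximately 1).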