

Poster

Shadow Manifold Hamiltonian Monte Carlo

Chris van der Heide · Fred Roosta · Liam Hodgkinson · Dirk Kroese

Keywords: [ Algorithms, Optimization and Computation Methods ] [ Monte Carlo Methods ]


Abstract:

Hamiltonian Monte Carlo and its descendants have found success in machine learning and computational statistics due to their ability to draw samples in high dimensions with greater efficiency than classical MCMC. One of these derivatives, Riemannian manifold Hamiltonian Monte Carlo (RMHMC), better adapts the sampler to the geometry of the target density, allowing for improved performance in sampling problems with complex geometric features. Other approaches have boosted acceptance rates by sampling from an integrator-dependent “shadow density” and compensating for the induced bias via importance sampling. We combine the benefits of RMHMC with those attained by sampling from the shadow density, by deriving the shadow Hamiltonian corresponding to the generalized leapfrog integrator used in RMHMC. This leads to a new algorithm, shadow manifold Hamiltonian Monte Carlo, which leaves the target density invariant and shows improved performance over RMHMC.
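To illustrate the shadow-density idea the abstract refers to, here is a minimal sketch, not the paper's algorithm: plain Euclidean HMC whose Metropolis tests use a caller-supplied shadow Hamiltonian `H_shadow`, with importance weights `exp(H_shadow - H)` correcting expectations back to the true target `exp(-U)`. The standard leapfrog stands in for the generalized (implicit) leapfrog of RMHMC, a unit mass matrix is assumed, and all function and parameter names are illustrative.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    """Standard leapfrog integrator (a stand-in for the generalized leapfrog of RMHMC)."""
    p = p - 0.5 * eps * grad_U(q)
    for i in range(n_steps):
        q = q + eps * p
        if i < n_steps - 1:
            p = p - eps * grad_U(q)
    p = p - 0.5 * eps * grad_U(q)
    return q, p

def shadow_hmc(U, grad_U, H_shadow, q0, eps=0.1, n_steps=20, n_iters=1000, seed=0):
    """Sketch of shadow-density HMC: the chain targets exp(-H_shadow(q, p)) jointly,
    and self-normalised importance weights recover expectations under exp(-U)."""
    rng = np.random.default_rng(seed)
    H = lambda q, p: U(q) + 0.5 * p @ p          # true Hamiltonian (unit mass matrix)
    q = np.asarray(q0, dtype=float)
    p = rng.standard_normal(q.shape)
    samples, log_w = [], []
    for _ in range(n_iters):
        # Momentum refresh, Metropolised so the joint chain keeps exp(-H_shadow) invariant.
        p_prop = rng.standard_normal(q.shape)
        log_a = (H_shadow(q, p) - H(q, p)) - (H_shadow(q, p_prop) - H(q, p_prop))
        if np.log(rng.uniform()) < log_a:
            p = p_prop
        # Dynamics proposal: leapfrog plus momentum flip, accepted under the shadow Hamiltonian.
        q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
        if np.log(rng.uniform()) < H_shadow(q, p) - H_shadow(q_new, -p_new):
            q, p = q_new, -p_new
        samples.append(q)
        log_w.append(H_shadow(q, p) - H(q, p))   # importance weight against exp(-H)
    w = np.exp(np.array(log_w) - np.max(log_w))  # self-normalised importance weights
    return np.array(samples), w / w.sum()
```

With `H_shadow = H` the weights are constant and the scheme reduces to ordinary HMC; the paper's contribution is a shadow Hamiltonian derived for the generalized leapfrog integrator, which would slot in as `H_shadow` while the integrator and Metropolis structure stay the same.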
