

Poster

Robust Approximate Sampling via Stochastic Gradient Barker Dynamics

Lorenzo Mauri · Giacomo Zanella

MR1 & MR2 - Number 173
Poster
Thu 2 May 8 a.m. PDT — 8:30 a.m. PDT
Oral presentation: Probabilistic Methods
Thu 2 May 6:45 a.m. PDT — 8 a.m. PDT

Abstract:

Stochastic Gradient (SG) Markov Chain Monte Carlo (MCMC) algorithms are popular for Bayesian sampling in the presence of large datasets. However, they come with few theoretical guarantees, and assessing their empirical performance is non-trivial. In this context, it is crucial to develop algorithms that are robust both to the choice of hyperparameters and to gradient heterogeneity, since in practice the choice of step size and the behaviour of the target gradients both induce hard-to-control biases in the invariant distribution. In this work we introduce the stochastic gradient Barker dynamics (SGBD) algorithm, extending the recently developed Barker MCMC scheme, a robust alternative to Langevin-based sampling algorithms, to the stochastic gradient framework. We characterize the impact of stochastic gradients on the Barker transition mechanism and develop a bias-corrected version that, under suitable assumptions, eliminates the error due to gradient noise in the proposal. We illustrate its performance on a number of high-dimensional examples, showing that SGBD is more robust to hyperparameter tuning and to irregular behaviour of the target gradients than the popular stochastic gradient Langevin dynamics algorithm.
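For intuition, below is a minimal sketch of a single coordinate-wise Barker proposal step with a noisy mini-batch gradient plugged in, following the standard Barker mechanism the abstract builds on. The function names, the Gaussian base noise, and the omission of the paper's bias correction are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def sg_barker_step(x, stochastic_grad, step_size, rng):
    """One coordinate-wise Barker proposal step driven by a noisy
    gradient estimate. Hypothetical sketch: SGBD's bias-corrected
    version additionally adjusts for gradient noise, omitted here."""
    g = stochastic_grad(x)                        # noisy estimate of grad log-target
    z = step_size * rng.standard_normal(x.shape)  # symmetric base proposal
    t = z * g
    # P(move in the gradient-informed direction), computed stably per coordinate
    e = np.exp(-np.abs(t))
    p = np.where(t >= 0, 1.0 / (1.0 + e), e / (1.0 + e))
    b = np.where(rng.random(x.shape) < p, 1.0, -1.0)  # per-coordinate flip
    return x + b * z

# Toy usage: sample from N(0, I) with artificially noisy gradients.
rng = np.random.default_rng(0)
noisy_grad = lambda x: -x + 0.5 * rng.standard_normal(x.shape)
x = np.zeros(10)
for _ in range(1000):
    x = sg_barker_step(x, noisy_grad, step_size=0.1, rng=rng)
```

Because each coordinate only flips the sign of a bounded increment rather than scaling with the gradient magnitude, this mechanism degrades gracefully when gradients are heterogeneous or heavy-tailed, which is the robustness property the abstract contrasts with Langevin-based schemes.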
