

Poster

Efficient Informed Proposals for Discrete Distributions via Newton’s Series Approximation

Yue Xiang · Dongyao Zhu · Bowen Lei · Dongkuan Xu · Ruqi Zhang

Auditorium 1 Foyer 143

Abstract:

Recent studies have exploited gradients in proposal distributions to accelerate the convergence of Markov chain Monte Carlo algorithms on discrete distributions. However, these methods require a natural differentiable extension of the target discrete distribution, which often does not exist or does not provide effective gradient guidance. In this paper, we develop a gradient-like proposal for any discrete distribution without requiring a natural differentiable extension. Built upon a locally-balanced proposal, our method efficiently approximates the discrete likelihood ratio via a Newton's series expansion, a discrete analog of the continuous Taylor series expansion, enabling large and efficient exploration in discrete spaces. We show that our method can also be viewed as a multilinear extension, thus inheriting its desirable properties. We prove that our method has a guaranteed convergence rate with or without the Metropolis-Hastings step. Furthermore, our method outperforms a number of popular alternatives in several different experiments, including the Ising model, the facility location problem, text summarization, and image retrieval.
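To make the abstract concrete, the following is a minimal, hypothetical sketch of a locally-balanced flip proposal on binary vectors, where the likelihood ratio for each single-coordinate flip is the first-order forward difference of the log-probability (the leading term of a Newton finite-difference series). This is an illustration of the general locally-balanced framework, not the paper's actual algorithm; the Ising-like model, the function names, and the brute-force difference computation are all assumptions made here for clarity.

```python
import numpy as np

def log_prob(x, J, b):
    # Unnormalized log-probability of an Ising-like model on x in {0,1}^d.
    return x @ J @ x + b @ x

def flip_differences(x, J, b):
    # First-order forward differences Delta_i f(x) = f(flip_i(x)) - f(x),
    # i.e. the leading term of a Newton (finite-difference) series for the
    # log likelihood ratio. Computed by brute force here for clarity; the
    # point of the paper's approximation is to avoid this O(d) recomputation.
    d = len(x)
    base = log_prob(x, J, b)
    diffs = np.empty(d)
    for i in range(d):
        y = x.copy()
        y[i] = 1.0 - y[i]          # flip coordinate i
        diffs[i] = log_prob(y, J, b) - base
    return diffs

def locally_balanced_flip_proposal(x, J, b):
    # Locally-balanced weights with balancing function g(t) = sqrt(t),
    # applied to the estimated ratios exp(diffs): weight_i = exp(diffs_i / 2).
    diffs = flip_differences(x, J, b)
    w = np.exp(diffs / 2.0)
    return w / w.sum()

rng = np.random.default_rng(0)
d = 8
J = rng.normal(scale=0.1, size=(d, d))
J = (J + J.T) / 2.0                 # symmetric couplings
b = rng.normal(size=d)
x = rng.integers(0, 2, size=d).astype(float)

probs = locally_balanced_flip_proposal(x, J, b)
i = rng.choice(d, p=probs)          # propose flipping coordinate i
```

In a full sampler, the proposed flip would then pass through a Metropolis-Hastings accept/reject step; the abstract notes that convergence is guaranteed with or without that correction.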
