

Poster

Deep Generative Quantile Bayes

Veronika Rockova


Abstract:

This paper develops a multivariate Bayesian posterior sampling method based on generative quantile learning. Our method learns a mapping that transforms (spherically) uniform random vectors into posterior samples without adversarial training. We leverage the Monge-Kantorovich depth underlying multivariate quantiles to sample directly from Bayesian credible sets, a feature not offered by typical posterior sampling methods. To enhance training of the quantile mapping, we design a neural network that automatically extracts summary statistics. This additional network structure yields performance benefits, including support shrinkage (posterior contraction) as the observation sample size increases. We demonstrate the usefulness of our approach on several examples where the absence of a tractable likelihood renders classical MCMC infeasible. Finally, we provide frequentist theoretical justification for our quantile learning framework.
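The credible-set sampling idea above can be sketched in a minimal, illustrative way: draw reference vectors from the spherically uniform distribution (radius uniform on (0, 1), direction uniform on the sphere, as in the Monge-Kantorovich depth literature), restrict their norms to at most 1 - alpha, and push them through a trained quantile map. The `quantile_map` here is a stand-in placeholder, not the authors' trained network; the reference-distribution and rescaling choices are assumptions based on standard MK-depth conventions, not details taken from the paper.

```python
import numpy as np

def spherical_uniform(n, d, rng):
    """Draw n reference vectors u = r * s, with direction s uniform on the
    unit sphere in R^d and radius r ~ Uniform(0, 1) -- the spherically
    uniform reference distribution used to define Monge-Kantorovich depth."""
    g = rng.standard_normal((n, d))
    s = g / np.linalg.norm(g, axis=1, keepdims=True)  # uniform directions
    r = rng.uniform(size=(n, 1))                      # uniform radii
    return r * s

def credible_set_sample(quantile_map, n, d, alpha, rng):
    """Sample from the (1 - alpha) MK-depth central region by pushing
    reference vectors with norm <= 1 - alpha through the quantile map.
    Rescaling the uniform radius by (1 - alpha) is equivalent to
    conditioning on r <= 1 - alpha."""
    u = spherical_uniform(n, d, rng) * (1.0 - alpha)
    return quantile_map(u)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical "trained" quantile map, purely for illustration:
    # an affine push-forward of the reference vectors.
    toy_map = lambda u: 2.0 * u + 1.0
    draws = credible_set_sample(toy_map, 500, 3, 0.1, rng)
    print(draws.shape)
```

With a genuinely trained quantile map in place of `toy_map`, the returned draws would lie inside the 90% credible region, since their preimages all have norm at most 0.9.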
