

Poster

Predictive variational Bayesian inference as risk-seeking optimization

Futoshi Futami · Tomoharu Iwata · Naonori Ueda · Issei Sato · Masashi Sugiyama


Abstract:

Since Bayesian inference performs poorly under model misspecification, various solutions have been explored to counteract this shortcoming. The recently proposed predictive Bayes (PB), which directly optimizes the Kullback-Leibler divergence between the empirical distribution and the approximate predictive distribution, shows excellent performance not only under model misspecification but also for over-parameterized models. However, its behavior and superiority are still unclear, which limits the applications of PB. Specifically, the superiority of PB has been shown only in terms of the predictive test log-likelihood, and its performance in the sense of parameter estimation has not been investigated yet. It is also not clear why PB is superior with misspecified and over-parameterized models. In this paper, we clarify these ambiguities by studying PB in the framework of risk-seeking optimization. To this end, we first provide a consistency theory for PB and then present an intuition for the robustness of PB to model misspecification using a response function theory. Thereafter, we theoretically and numerically show that PB has an implicit regularization effect that leads to flat local minima in over-parameterized models.
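As a rough sketch of the objective described above (the exact variational family and any additional regularization used in the paper are not specified here), PB can be read as minimizing the KL divergence from the empirical distribution $\hat{p}_n$ of the observed data $x_1,\dots,x_n$ to the predictive distribution induced by an approximate posterior $q$:

\[
\min_{q}\; \mathrm{KL}\!\left(\hat{p}_n \,\|\, p_q\right),
\qquad
p_q(x) = \int p(x \mid \theta)\, q(\theta)\, d\theta .
\]

Up to the constant entropy of $\hat{p}_n$, this is equivalent to maximizing the average predictive log-likelihood $\frac{1}{n}\sum_{i=1}^{n}\log p_q(x_i)$, in contrast to standard variational Bayes, which maximizes the ELBO and hence a lower bound on the marginal likelihood.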
