

Poster

Information-theoretic Analysis of Bayesian Test Data Sensitivity

Futoshi Futami · Tomoharu Iwata

MR1 & MR2 - Number 98
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT

Abstract:

Bayesian inference is often used to quantify uncertainty. Several recent analyses have rigorously decomposed the uncertainty in predictions made by Bayesian inference into two types: the inherent randomness of the data-generating process and the variability due to lack of data. Existing studies analyze these uncertainties from an information-theoretic perspective, assuming the model is well-specified and treating the model parameters as latent variables. However, such information-theoretic uncertainty analyses fail to account for a widely believed property of uncertainty, namely its sensitivity between test and training data: if the test data is similar to the training data in some sense, the uncertainty should be smaller. In this work, we analyze this sensitivity through a new decomposition of uncertainty and define it in terms of information-theoretic quantities. Furthermore, we extend the existing analysis of Bayesian meta-learning and, for the first time, show novel sensitivities among tasks.
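
For context, the two-way split referenced in the abstract is commonly expressed via the standard information-theoretic decomposition of Bayesian predictive uncertainty, sketched below; the notation (test input x, test label y, training data D_n, parameters theta) is an assumption for illustration and is not quoted from the paper.

% Sketch of the standard decomposition of predictive uncertainty
% (requires amsmath; notation assumed, not taken from the paper).
% y: test label, x: test input, D_n: training data, \theta: model parameters.
\begin{align}
\underbrace{H\!\left[p(y \mid x, D_n)\right]}_{\text{total uncertainty}}
  = \underbrace{\mathbb{E}_{p(\theta \mid D_n)}\!\left[ H\!\left[p(y \mid x, \theta)\right] \right]}_{\text{inherent randomness of the data}}
  + \underbrace{I(y; \theta \mid x, D_n)}_{\text{variability due to lack of data}}
\end{align}

The first term is the expected conditional entropy of the predictive distribution given the parameters, and the second is the conditional mutual information between the prediction and the parameters; the paper's contribution concerns how quantities of this kind depend on the similarity between test and training data.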
