

Poster

Model-agnostic out-of-distribution detection using combined statistical tests

Federico Bergamin · Pierre-Alexandre Mattei · Jakob Drachmann Havtorn · Hugo Sénétaire · Hugo Schmutz · Lars Maaløe · Soren Hauberg · Jes Frellsen


Abstract:

We present simple methods for out-of-distribution detection using a trained generative model. These techniques, based on classical statistical tests, are model-agnostic in the sense that they can be applied to any differentiable generative model. The idea is to combine a classical parametric test (Rao's score test) with the recently introduced typicality test. These two test statistics are both theoretically well-founded and exploit different sources of information: the typicality test relies on the likelihood, while the score test relies on its gradient. We show that combining them using Fisher's method leads overall to a more accurate out-of-distribution test. We also discuss the benefits of casting out-of-distribution detection as a statistical testing problem, noting in particular that false positive rate control can be valuable for practical out-of-distribution detection. Despite their simplicity and generality, these methods can be competitive with model-specific out-of-distribution detection algorithms without any assumptions on the out-distribution.
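As a concrete illustration of the combination step, the following is a minimal sketch, not the authors' implementation, of how two per-sample statistics (a typicality statistic derived from the log-likelihood and a score statistic derived from its gradient) could be turned into empirical p-values on held-out in-distribution data and then combined with Fisher's method. The statistic functions `typicality_stat` and `score_stat`, the calibration data, and the model hooks they would use are hypothetical placeholders.

```python
# Hedged sketch of combining two test statistics with Fisher's method.
# How the per-sample statistics are computed from the generative model is
# assumed; only the p-value and combination machinery below is generic.
import numpy as np
from scipy.stats import chi2

def empirical_pvalue(stat, null_stats):
    """One-sided empirical p-value: how often held-out in-distribution
    statistics are at least as extreme as the observed one (the +1
    smoothing keeps the p-value strictly positive)."""
    null_stats = np.asarray(null_stats)
    return (1 + np.sum(null_stats >= stat)) / (1 + len(null_stats))

def fisher_combine(pvalues):
    """Fisher's method: under H0, -2 * sum(log p_i) follows a chi-squared
    distribution with 2k degrees of freedom for k independent p-values."""
    stat = -2.0 * np.sum(np.log(pvalues))
    return chi2.sf(stat, df=2 * len(pvalues))

# Hypothetical usage, assuming `typicality_stat` and `score_stat` compute
# per-sample statistics from a trained generative model and `calib_data`
# is held-out in-distribution data used to calibrate the null distribution:
# calib_typ = np.array([typicality_stat(x) for x in calib_data])
# calib_score = np.array([score_stat(x) for x in calib_data])
# p_typ = empirical_pvalue(typicality_stat(x_new), calib_typ)
# p_score = empirical_pvalue(score_stat(x_new), calib_score)
# p_combined = fisher_combine([p_typ, p_score])  # small value => flag as OOD
```

Because the output is a p-value, a rejection threshold directly controls the false positive rate on in-distribution data, which is one of the practical benefits of the statistical-testing view discussed in the abstract.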
