

Oral

Transductive conformal inference with adaptive scores

Ulysse Gazin · Gilles Blanchard · Etienne Roquain

Oral Session: Statistics
Sat 4 May 2:30 a.m. — 3:30 a.m. PDT

Abstract: Conformal inference is a fundamental and versatile tool that provides distribution-free guarantees for many machine learning tasks. We consider the transductive setting, where decisions are made on a test sample of $m$ new points, giving rise to $m$ conformal $p$-values. While classical results only concern their marginal distribution, we show that their joint distribution follows a Pólya urn model, and we establish a concentration inequality for their empirical distribution function. The results hold for arbitrary exchangeable scores, including adaptive ones that can use the covariates of the test and calibration samples at the training stage for increased accuracy. We demonstrate the usefulness of these theoretical results through uniform, in-probability guarantees for two machine learning tasks of current interest: interval prediction for transductive transfer learning and novelty detection based on two-class classification.
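The central objects in the abstract are the $m$ conformal $p$-values built from exchangeable nonconformity scores. Below is a minimal Python sketch (an illustration under standard conformal-inference conventions, not the authors' code; the function name and variables are assumptions) showing how such $p$-values are typically computed from a calibration sample of size $n$ and a test sample of size $m$.

import numpy as np

def conformal_p_values(calib_scores, test_scores):
    """Marginal conformal p-value for each of the m test points.

    p_j = (1 + #{calibration scores >= test score j}) / (n + 1),
    where n is the calibration sample size.
    """
    calib_scores = np.asarray(calib_scores)
    test_scores = np.asarray(test_scores)
    n = calib_scores.size
    # For each test score, count how many calibration scores are at least as large.
    counts = (calib_scores[None, :] >= test_scores[:, None]).sum(axis=1)
    return (1 + counts) / (n + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    calib = rng.normal(size=1000)   # n calibration scores (illustrative data)
    test = rng.normal(size=100)     # m test scores, exchangeable with calibration
    p = conformal_p_values(calib, test)
    # Each p_j is marginally (super-)uniform; the paper's contribution is to
    # characterise their joint distribution (a Pólya urn model) and the
    # concentration of their empirical distribution function.
    print(p[:5])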
