Abstract:
Conformal inference is a fundamental and versatile tool that provides distribution-free guarantees. We consider the transductive setting where decisions are made for a test sample of $m$ new points, giving rise to a family of $m$ conformal $p$-values. While classical results only concern their marginal distribution, this paper shows that their joint distribution can be described with a P\'olya urn model, which entails a concentration inequality for their empirical distribution function. These results hold for arbitrary exchangeable scores, including some adaptive ones that can use the covariates of the test sample. We demonstrate the usefulness of these general theoretical results by providing uniform guarantees for two machine learning tasks of current interest: interval prediction for transductive transfer learning and novelty detection based on two-class classification.
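To make the object of study concrete, here is a minimal sketch (not the paper's method, just the standard split-conformal construction) of the family of $m$ conformal $p$-values for a test sample, computed from $n$ exchangeable calibration scores; all function and variable names are illustrative:

```python
import numpy as np

def conformal_p_values(cal_scores, test_scores):
    """Standard split-conformal p-values.

    p_j = (1 + #{calibration scores >= test score j}) / (n + 1),
    which is marginally super-uniform under exchangeability.
    """
    cal = np.asarray(cal_scores, dtype=float)
    n = len(cal)
    return np.array([(1 + np.sum(cal >= s)) / (n + 1)
                     for s in np.asarray(test_scores, dtype=float)])

# Toy usage: n = 3 calibration scores, m = 2 test points.
pvals = conformal_p_values([1.0, 2.0, 3.0], [2.5, 0.5])
# One calibration score (3.0) is >= 2.5, so the first p-value is (1+1)/4 = 0.5;
# all three calibration scores are >= 0.5, so the second is (1+3)/4 = 1.0.
```

The paper's contribution concerns the *joint* distribution of such $p$-values across the $m$ test points, which this marginal construction does not by itself reveal.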