Poster

Selective Classification via One-Sided Prediction

Aditya Gangrade · Anil Kag · Venkatesh Saligrama

Keywords: [ Classification ] [ Models and Methods ]

Wed 14 Apr 12:45 p.m. PDT — 2:45 p.m. PDT

Abstract:

We propose a novel method for selective classification (SC), a problem in which a classifier is allowed to abstain from predicting on some instances, thus trading off accuracy against coverage (the fraction of instances predicted). In contrast to prior gating- or confidence-set-based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding, for each class, the largest decision set that incurs few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high-target-accuracy regime, and further admits efficient implementation, leading to a flexible and principled method for SC. We theoretically derive generalization bounds for SC and OSP, and empirically we show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
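To make the abstract's idea concrete, here is a minimal, hypothetical sketch of a selective classifier built from per-class "accept" rules: each class accepts an instance when its score clears a threshold, and the classifier predicts only when exactly one class accepts, abstaining otherwise. The nearest-class-mean scores, the synthetic data, and the single shared threshold `tau` are all illustrative stand-ins; the paper's method learns the class-wise one-sided decision sets by optimising decoupled empirical risks, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 3-class Gaussian data (illustrative only, not from the paper).
means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = np.vstack([rng.normal(m, 1.0, size=(200, 2)) for m in means])
y = np.repeat(np.arange(3), 200)

# Per-class scores: negative distance to the class mean, a crude
# stand-in for the learned one-sided classifiers.
scores = -np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)

# One-sided rule: class k "accepts" x when its score clears a threshold.
# Predict only when exactly one class accepts; otherwise abstain.
tau = -1.5  # hypothetical shared threshold; the paper effectively tunes per class
accept = scores > tau
predict_mask = accept.sum(axis=1) == 1
pred = scores.argmax(axis=1)

coverage = predict_mask.mean()
accuracy = (pred[predict_mask] == y[predict_mask]).mean()
print(f"coverage={coverage:.2f}, selective accuracy={accuracy:.2f}")
```

Raising `tau` shrinks the decision sets, lowering coverage while pushing selective accuracy up, which is the accuracy-coverage trade-off the abstract describes.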
