

Poster

Active Learning under Label Shift

Eric Zhao · Anqi Liu · Animashree Anandkumar · Yisong Yue

Keywords: [ Active Learning ] [ Learning Theory and Statistics ]


Abstract:

We address the problem of active learning under label shift, where the class proportions of the source and target domains differ. We introduce a "medial distribution" to trade off between importance weighting and class-balanced sampling, and propose their combined use in active learning. Our method, Mediated Active Learning under Label Shift (MALLS), balances the bias from class-balanced sampling against the variance from importance weighting. We prove sample complexity and generalization guarantees for MALLS, showing that active learning reduces asymptotic sample complexity even under arbitrary label shift. We empirically demonstrate that MALLS scales to high-dimensional datasets and can reduce the sample complexity of active learning by 60% in deep active learning tasks.
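To make the tradeoff concrete, the following sketch illustrates the general idea of interpolating between importance weighting and class-balanced sampling. The abstract does not give MALLS's exact construction, so the `alpha`-interpolated "medial" distribution and the function `medial_weights` below are hypothetical illustrations, not the paper's actual method:

```python
# Illustrative sketch only: interpolate between the target label
# distribution (pure importance weighting, alpha=0) and a uniform,
# class-balanced distribution (alpha=1), then form per-class weights
# against the source label distribution. The exact MALLS construction
# may differ; alpha and medial_weights are assumptions for illustration.

def medial_weights(p_source, p_target, alpha):
    """Per-class weights w_y = q_y / p_source[y], where q is an
    interpolation between p_target and the uniform distribution."""
    k = len(p_source)
    uniform = [1.0 / k] * k
    medial = [(1 - alpha) * t + alpha * u
              for t, u in zip(p_target, uniform)]
    return [q / p for q, p in zip(medial, p_source)]

# Example: 3 classes; source is skewed toward class 0, target toward class 2.
p_s = [0.6, 0.3, 0.1]
p_t = [0.1, 0.3, 0.6]
print(medial_weights(p_s, p_t, alpha=0.0))  # pure importance weights p_t/p_s
print(medial_weights(p_s, p_t, alpha=1.0))  # class-balancing weights
```

At `alpha=0` the rare source class receives a large weight (high variance); at `alpha=1` the weights are flatter but biased away from the target proportions, which is the bias-variance tradeoff the abstract describes.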
