

Poster

Near-Polynomially Competitive Active Logistic Regression

Yihan Zhou · Trung Nguyen


Abstract: We address the problem of active logistic regression in the realizable setting. It is well known that active learning can require exponentially fewer label queries than passive learning, in some cases using log(1/ε) rather than poly(1/ε) samples to achieve error ε above the optimum. We present the first algorithm that is polynomially competitive with the optimal algorithm on every input instance, up to factors polylogarithmic in the error and domain size. In particular, if any algorithm achieves sample complexity polylogarithmic in 1/ε, so does ours. Our algorithm is based on efficient sampling and can be extended to learn a more general class of functions. We further support our theoretical results with experiments demonstrating performance gains for logistic regression over existing active learning algorithms.
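To illustrate the active-versus-passive contrast the abstract describes, the sketch below runs a generic uncertainty-sampling loop for logistic regression: label a few points at random, then repeatedly query the unlabeled point whose predicted probability is closest to 1/2. This is a standard baseline technique for illustration only, not the paper's near-polynomially competitive algorithm; all names and the synthetic 1-D setup are assumptions.

```python
import numpy as np

# Illustrative sketch only: generic uncertainty sampling for logistic
# regression. NOT the paper's algorithm; it just shows how an active
# learner concentrates label queries near the decision boundary.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, steps=500):
    """Fit logistic-regression weights by plain gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Realizable setting: labels drawn from a true logistic model
# y ~ Bernoulli(sigmoid(3x - 1)) over a pool of unlabeled points.
n = 500
pool = rng.uniform(-2, 2, size=(n, 1))
X_pool = np.hstack([pool, np.ones((n, 1))])  # append bias column
w_true = np.array([3.0, -1.0])
labels = (rng.random(n) < sigmoid(X_pool @ w_true)).astype(float)

# Active loop: seed with 5 random labels, then query the point whose
# predicted probability is closest to 1/2 (maximum uncertainty).
queried = list(rng.choice(n, size=5, replace=False))
for _ in range(20):
    w = fit_logistic(X_pool[queried], labels[queried])
    uncertainty = np.abs(sigmoid(X_pool @ w) - 0.5)
    uncertainty[queried] = np.inf  # never re-query a labeled point
    queried.append(int(np.argmin(uncertainty)))

w_active = fit_logistic(X_pool[queried], labels[queried])
print("labels queried:", len(queried), "learned weights:", w_active)
```

With only 25 labels the active learner recovers a reasonable fit because its queries cluster near the boundary x ≈ 1/3; a passive learner spends most of its label budget on easy, far-from-boundary points.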
