

Active Learning for Single Neuron Models with Lipschitz Non-Linearities

Aarshvi Gajjar · Christopher Musco · Chinmay Hegde

Auditorium 1 Foyer 28


We consider the problem of active learning for single neuron models, also sometimes called "ridge functions", in the agnostic setting (under adversarial label noise). Such models have been shown to be broadly effective in modeling physical phenomena and in constructing surrogate data-driven models for partial differential equations. Surprisingly, we show that for a single neuron model with any Lipschitz non-linearity (such as the ReLU, sigmoid, absolute value, or a low-degree polynomial, among others), strong provable approximation guarantees can be obtained using a well-known active learning strategy for fitting *linear functions* in the agnostic setting. Namely, we can collect samples via statistical leverage score sampling, which has been shown to be near-optimal in other active learning scenarios. We support our theoretical results with empirical simulations showing that our proposed active learning strategy based on leverage score sampling outperforms (ordinary) uniform sampling when fitting single neuron models.
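The core sampling step the abstract refers to, statistical leverage score sampling, can be sketched as follows. This is a minimal illustration, not the authors' code: the ReLU target, the toy data, the sample size, and the reweighting convention are all assumptions made for the example. Leverage scores are the squared row norms of an orthonormal basis for the column space of the design matrix, and rows are sampled with probability proportional to their scores, then reweighted so the subsampled least-squares objective is unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single neuron data: y = relu(X @ w_true) plus small noise (assumed setup).
n, d = 2000, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = np.maximum(X @ w_true, 0.0) + 0.05 * rng.standard_normal(n)

# Statistical leverage scores: squared row norms of an orthonormal basis U
# for col(X). They lie in [0, 1] and sum to rank(X) = d.
U, _, _ = np.linalg.svd(X, full_matrices=False)
lev = np.sum(U**2, axis=1)
probs = lev / lev.sum()

# Actively query m labels, sampling rows proportionally to leverage, and
# reweight each sampled row by 1/sqrt(m * p_i) so the subsampled
# least-squares objective is unbiased for the full one.
m = 200
idx = rng.choice(n, size=m, replace=True, p=probs)
weights = 1.0 / np.sqrt(m * probs[idx])
Xs = X[idx] * weights[:, None]
ys = y[idx] * weights

# Fit a linear surrogate on the reweighted subsample; the non-linearity is
# then applied on top when predicting (a simplification of the strategy
# described in the abstract, not the paper's full procedure).
w_hat, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
pred = np.maximum(X @ w_hat, 0.0)
```

In this sketch only the `m` sampled labels would need to be observed, which is the point of the active learning setup: the leverage scores depend on `X` alone, so the sampling distribution can be computed before any labels are queried.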
