Heavy-tailed Streaming Statistical Estimation

Che-Ping Tsai · Adarsh Prasad · Sivaraman Balakrishnan · Pradeep Ravikumar

Wed 30 Mar 8:30 a.m. PDT — 10 a.m. PDT
Oral presentation: Oral 11: Learning theory / Kernels
Wed 30 Mar 7 a.m. PDT — 8 a.m. PDT

Abstract: We consider the task of heavy-tailed statistical estimation given streaming $p$-dimensional samples. This can also be viewed as stochastic optimization under heavy-tailed distributions, with an additional $O(p)$ space-complexity constraint. We design a clipped stochastic gradient descent algorithm and provide an improved analysis, under a more nuanced condition on the noise of the stochastic gradients, which we show is critical when analyzing stochastic optimization problems arising from general statistical estimation problems. Our results guarantee convergence not just in expectation but with exponential concentration, and moreover do so using an $O(1)$ batch size. We derive consequences of our results for mean estimation and linear regression. Finally, we provide empirical corroboration of our results and algorithms via synthetic experiments for mean estimation and linear regression.
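To make the setting concrete, here is a minimal sketch of streaming clipped SGD applied to mean estimation, one of the tasks the abstract mentions. For the squared loss $f(\theta) = \tfrac{1}{2}\|\theta - x\|^2$ the stochastic gradient at a sample $x$ is $\theta - x$, and clipping its norm before each $O(p)$-space update tames heavy-tailed noise. The function name, clipping threshold, and step-size schedule below are illustrative assumptions, not the authors' exact algorithm or tuning.

```python
import numpy as np

def clipped_sgd_mean(samples, clip=5.0, step=0.1):
    """Streaming mean estimation via norm-clipped SGD.

    For f(theta) = 0.5 * ||theta - x||^2, the stochastic gradient at
    sample x is (theta - x). We clip its Euclidean norm at `clip`
    before each update, so only the O(p) iterate is kept in memory.
    Threshold and step schedule are illustrative choices.
    """
    theta = np.zeros(samples.shape[1])
    for t, x in enumerate(samples, start=1):
        g = theta - x                        # stochastic gradient
        norm = np.linalg.norm(g)
        if norm > clip:                      # clip the gradient norm
            g = g * (clip / norm)
        theta -= (step / np.sqrt(t)) * g     # decaying step size
    return theta

# Example: heavy-tailed (Student-t, 3 degrees of freedom) samples
rng = np.random.default_rng(0)
true_mean = np.full(3, 2.0)
samples = true_mean + rng.standard_t(df=3, size=(5000, 3))
estimate = clipped_sgd_mean(samples)
```

Because the update touches one sample at a time and stores only the current iterate, the memory footprint is $O(p)$, matching the streaming constraint described above.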
