THE SAMPLE COMPLEXITY OF ERM IN EUCLIDEAN STOCHASTIC CONVEX OPTIMIZATION
Daniel Carmon · Amir Yehudayoff · Roi Livni
MR1 & MR2 - Number 130
Oral presentation: Optimization
Thu 2 May 5 a.m. PDT — 6:15 a.m. PDT
Poster:
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT
Abstract:
Stochastic convex optimization is a model for optimization in a stochastic environment that is also motivated by the theory of machine learning. Empirical risk minimization (ERM) is a fundamental algorithmic principle. We study the sample complexity of ERM in stochastic convex optimization in the standard Euclidean setup in $d$-dimensional space with accuracy parameter $\epsilon > 0$. A folklore example shows that the sample complexity is at least $\Omega(\frac{1}{\epsilon^2})$. Shalev-Shwartz, Shamir, Srebro and Sridharan proved that the complexity is at least $\Omega(\log(d))$ for $\epsilon = \frac{1}{2}$. Later on, Feldman proved that it is at least $\Omega(\frac{d}{\epsilon})$. Proving asymptotically stronger lower bounds was left as an open problem. We prove that, in fact, the sample complexity is at most $\tilde{O}(\frac{d}{\epsilon}+\frac{1}{\epsilon^2})$. The proof builds a mechanism for controlling stochastic convex optimization problems.
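Taken together, the lower bounds above and the new upper bound determine the sample complexity up to logarithmic factors. A sketch of the resulting two-sided bound, writing $n_{\mathrm{ERM}}(d,\epsilon)$ for the sample complexity of ERM (a notation introduced here for illustration, not taken from the paper):

$$\Omega\!\left(\frac{d}{\epsilon} + \frac{1}{\epsilon^{2}}\right) \;\le\; n_{\mathrm{ERM}}(d,\epsilon) \;\le\; \tilde{O}\!\left(\frac{d}{\epsilon} + \frac{1}{\epsilon^{2}}\right),$$

where the lower bound combines the folklore $\Omega(\frac{1}{\epsilon^2})$ bound with Feldman's $\Omega(\frac{d}{\epsilon})$ bound: the sample complexity exceeds the maximum of the two, and $\max(a,b) \ge \frac{a+b}{2}$.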