Poster

Large deviations rates for stochastic gradient descent with strongly convex functions

Dragana Bajovic · Dusan Jakovetic · Soummya Kar

Auditorium 1 Foyer 74

Abstract:

Recent works have shown that high probability metrics for stochastic gradient descent (SGD) can be more informative than, and in some cases advantageous over, the commonly adopted mean-square-error-based ones. In this work we provide a formal framework for the study of general high probability bounds for SGD, based on the theory of large deviations. The framework allows for a generic (not necessarily bounded) gradient noise satisfying mild technical assumptions, including dependence of the noise distribution on the current iterate. Under these assumptions, we establish an upper large deviation bound for SGD with strongly convex functions. The corresponding rate function captures analytical dependence on the noise distribution and other problem parameters. This is in contrast with conventional mean-square error analysis, which captures the noise only through its variance and does not reflect the effect of higher-order moments or distribution skew. We also derive exact large deviation rates for the case when the objective function is quadratic, and show that the obtained rate function matches the one from the general upper bound, thereby establishing the tightness of the general bound. Numerical examples illustrate and corroborate the theoretical findings.
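The flavor of the result can be sketched numerically: for SGD on a strongly convex quadratic with zero-mean gradient noise, the probability that an iterate lies outside a fixed ball around the minimizer decays rapidly with iterations. The sketch below is purely illustrative and is not the paper's construction; the scalar quadratic `f(x) = (a/2) x^2`, Gaussian noise, and all parameter values are assumptions chosen for simplicity (the paper's setting allows general, possibly iterate-dependent noise).

```python
import numpy as np

def sgd_quadratic_tail(a=2.0, step=0.1, eps=0.5, n_iters=200,
                       n_runs=2000, seed=0):
    """Run SGD on the scalar quadratic f(x) = (a/2) x^2 with additive
    zero-mean Gaussian gradient noise, and empirically estimate the tail
    probability P(|x_k| > eps) at each iteration k across independent runs.
    Illustrative sketch only; parameters are hypothetical choices."""
    rng = np.random.default_rng(seed)
    x = np.ones(n_runs)  # every run starts at x_0 = 1
    tail = []
    for _ in range(n_iters):
        noise = rng.standard_normal(n_runs)  # zero-mean gradient noise
        x = x - step * (a * x + noise)       # SGD step on grad f(x) = a*x
        tail.append(float(np.mean(np.abs(x) > eps)))
    return tail

probs = sgd_quadratic_tail()
```

Here `probs[k]` is a Monte Carlo estimate of the tail probability after `k+1` steps; with a stable step size it drops from near 1 at the start to a small stationary value, consistent with an exponentially small deviation probability.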
