

Poster

Diagonalisation SGD: Fast & Convergent SGD for Non-Differentiable Models via Reparameterisation and Smoothing

Dominik Wagner · Basim Khajwal · Luke Ong

MR1 & MR2 - Number 164
Thu 2 May 8 a.m. PDT — 8:30 a.m. PDT

Abstract:

It is well known that the reparameterisation gradient estimator, which exhibits low variance in practice, is biased for non-differentiable models. This may compromise the correctness of gradient-based optimisation methods such as stochastic gradient descent (SGD). We introduce a simple syntactic framework to define non-differentiable functions piecewise and present a systematic approach to obtain smoothings for which the reparameterisation gradient estimator is unbiased. Our main contribution is a novel variant of SGD, Diagonalisation Stochastic Gradient Descent, which progressively enhances the accuracy of the smoothed approximation during optimisation, and we prove convergence to stationary points of the unsmoothed (original) objective. Our empirical evaluation reveals benefits over the state of the art: our approach is simple, fast and stable, and it attains orders-of-magnitude reductions in work-normalised variance.
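To make the idea concrete, below is a minimal, illustrative sketch in JAX (not the authors' implementation). It minimises E_{z~N(0,1)}[f(theta + z)] for a function f with a jump discontinuity: the Heaviside step inside f is replaced by a sigmoid sigmoid(x / eta), so the reparameterisation gradient of the smoothed objective is well defined, and the accuracy coefficient eta_k is annealed towards 0 across SGD iterations in the spirit of Diagonalisation SGD. The example model f, the sigmoid smoothing, and the schedule eta_k = 1/sqrt(k+1) are all illustrative assumptions; the paper develops the general piecewise framework and the conditions such schedules must satisfy.

```python
import jax
import jax.numpy as jnp


def smoothed_objective(theta, z, eta):
    # Illustrative model: f(x) = [x > 0] * x**2 + [x <= 0] * (x + 1)**2,
    # discontinuous at x = 0. The Heaviside step [x > 0] is replaced by
    # sigmoid(x / eta); as eta -> 0 the smoothing tightens.
    x = theta + z                       # reparameterisation: z ~ N(0, 1)
    s = jax.nn.sigmoid(x / eta)         # smooth surrogate for the step function
    return s * x**2 + (1.0 - s) * (x + 1.0) ** 2


# Unbiased reparameterisation gradient of the *smoothed* objective.
grad_fn = jax.jit(jax.grad(smoothed_objective))


def diagonalisation_sgd(theta0, steps=5000, lr=1e-2, batch=8, seed=0):
    key = jax.random.PRNGKey(seed)
    theta = theta0
    for k in range(steps):
        key, sub = jax.random.split(key)
        z = jax.random.normal(sub, (batch,))
        # Accuracy coefficient: the smoothing tightens as optimisation
        # progresses (an assumed schedule, for illustration only).
        eta = 1.0 / jnp.sqrt(k + 1.0)
        g = jnp.mean(jax.vmap(lambda zi: grad_fn(theta, zi, eta))(z))
        theta = theta - lr * g          # plain SGD step on the smoothed objective
    return theta


print(diagonalisation_sgd(jnp.array(2.0)))
```

The key design point the sketch tries to capture is that a single optimisation run interleaves two limits: the SGD iterates and the smoothing accuracy, rather than fixing one smoothed surrogate and optimising it to completion.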
