

Poster

Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

Kyurae Kim · Yian Ma · Jacob Gardner

Multipurpose Room 2 - Number 144

Abstract: We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called “linear”) rate under perfect variational family specification. In particular, we prove a quadratic bound on the gradient variance of the STL estimator, one which encompasses misspecified variational families. Combined with previous work on the quadratic variance condition, this directly implies the convergence of BBVI with projected stochastic gradient descent. For the projection operator, we consider a domain with triangular scale matrices, onto which the projection is computable in $\Theta(d)$ time, where $d$ is the dimensionality of the target posterior. We also improve the existing analysis of the regular closed-form entropy gradient estimators, which enables a comparison against the STL estimator, providing explicit non-asymptotic complexity guarantees for both.
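
To connect the abstract to an implementation, here is a minimal JAX sketch of the STL estimator for a full-rank Gaussian family with a lower-triangular scale matrix, followed by one projected-SGD step. Everything below is an illustrative assumption rather than the paper's code: the names (`log_joint`, `sigma_min`, `stepsize`) are hypothetical, and the feasible set is assumed to bound the diagonal of the scale matrix below by `sigma_min`, which yields a projection that touches only the $d$ diagonal entries, consistent with the $\Theta(d)$ cost stated above.

```python
import jax
import jax.numpy as jnp

def unpack(lam, d):
    """Split the parameter vector into mean m (d,) and
    lower-triangular scale L (d, d)."""
    m = lam[:d]
    L = jnp.zeros((d, d)).at[jnp.tril_indices(d)].set(lam[d:])
    return m, L

def log_q(lam, z, d):
    """Log density of q_lam = N(m, L L^T) evaluated at z."""
    m, L = unpack(lam, d)
    u = jax.scipy.linalg.solve_triangular(L, z - m, lower=True)
    return (-0.5 * u @ u
            - jnp.sum(jnp.log(jnp.abs(jnp.diag(L))))
            - 0.5 * d * jnp.log(2.0 * jnp.pi))

def stl_surrogate(lam, eps, log_joint, d):
    """Single-sample STL surrogate ELBO: differentiate through the
    reparameterized path z = m + L @ eps, but stop the gradient of
    lam inside log q, which drops the score-function term."""
    m, L = unpack(lam, d)
    z = m + L @ eps
    return log_joint(z) - log_q(jax.lax.stop_gradient(lam), z, d)

def psgd_step(lam, eps, log_joint, d, stepsize=1e-2, sigma_min=1e-6):
    """One projected stochastic gradient ascent step on the ELBO.
    The projection clamps diag(L) to [sigma_min, inf); it touches
    only the d diagonal entries, hence Theta(d) time."""
    g = jax.grad(stl_surrogate)(lam, eps, log_joint, d)
    lam = lam + stepsize * g
    m, L = unpack(lam, d)
    L = L.at[jnp.diag_indices(d)].set(jnp.maximum(jnp.diag(L), sigma_min))
    return jnp.concatenate([m, L[jnp.tril_indices(d)]])

# Toy usage: fit a 2D standard normal target.
d = 2
log_joint = lambda z: -0.5 * z @ z          # unnormalized log target
lam = jnp.concatenate([jnp.zeros(d), jnp.eye(d)[jnp.tril_indices(d)]])
key = jax.random.PRNGKey(0)
for _ in range(1000):
    key, sub = jax.random.split(key)
    lam = psgd_step(lam, jax.random.normal(sub, (d,)), log_joint, d)
```

The `stop_gradient` is what makes this the STL estimator: removing the score-function term gives an estimator whose variance vanishes when $q$ exactly matches the target posterior, which is the property behind the geometric rate under perfect variational family specification.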
