

Poster

Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

Kyurae Kim · Yian Ma · Jacob Gardner

MR1 & MR2 - Number 144
Thu 2 May 8 a.m. PDT — 8:30 a.m. PDT

Abstract: We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called “linear”) rate under perfect variational family specification. In particular, we prove a quadratic bound on the gradient variance of the STL estimator, one which encompasses misspecified variational families. Combined with previous works on the quadratic variance condition, this directly implies convergence of BBVI with the use of projected stochastic gradient descent. For the projection operator, we consider a domain with triangular scale matrices, onto which the projection is computable in $\Theta(d)$ time, where $d$ is the dimensionality of the target posterior. We also improve existing analysis on the regular closed-form entropy gradient estimators, which enables comparison against the STL estimator, providing explicit non-asymptotic complexity guarantees for both.
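Below is a minimal sketch, in JAX, of the two ingredients the abstract refers to, under assumptions not stated on this page: a location-scale variational family $q_\lambda = \mathcal{N}(m, CC^\top)$ with lower-triangular scale matrix $C$, an STL surrogate formed by stopping the gradient through the variational density, and a $\Theta(d)$ projection that keeps $C$ lower triangular and clips its diagonal to a lower bound. Names such as `logp` and `sigma_min` are illustrative and not the authors' notation; this is not the paper's exact projection operator.

```python
import jax
import jax.numpy as jnp

def sample_z(m, C, eps):
    """Reparameterized draw z = m + C eps for standard-normal eps."""
    return m + C @ eps

def log_q(z, m, C):
    """Log-density of N(m, C C^T) with lower-triangular C."""
    d = z.shape[0]
    y = jax.scipy.linalg.solve_triangular(C, z - m, lower=True)
    log_det = jnp.sum(jnp.log(jnp.abs(jnp.diag(C))))
    return -0.5 * y @ y - log_det - 0.5 * d * jnp.log(2.0 * jnp.pi)

def stl_elbo_surrogate(params, eps, logp):
    """STL surrogate: the variational parameters inside log q are
    stop-gradient'ed, so only the path through z = m + C eps remains."""
    m, C = params
    z = sample_z(m, C, eps)
    m_sg, C_sg = jax.lax.stop_gradient(m), jax.lax.stop_gradient(C)
    return logp(z) - log_q(z, m_sg, C_sg)

def project(params, sigma_min=1e-3):
    """Theta(d) projection: keep C lower triangular, clip its diagonal."""
    m, C = params
    C = jnp.tril(C)
    diag = jnp.maximum(jnp.diag(C), sigma_min)
    C = C - jnp.diag(jnp.diag(C)) + jnp.diag(diag)
    return m, C

# One projected stochastic gradient ascent step on a toy standard-normal
# target (illustrative only).
logp = lambda z: -0.5 * z @ z
d, step = 2, 1e-2
params = (jnp.zeros(d), jnp.eye(d))
eps = jax.random.normal(jax.random.PRNGKey(0), (d,))
grads = jax.grad(stl_elbo_surrogate)(params, eps, logp)
params = project(tuple(p + step * g for p, g in zip(params, grads)))
```

The projection touches only the $d$ diagonal entries (the `tril` mask is structural), which is what makes it $\Theta(d)$ in the dimension of the posterior.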
