Amortized Structural Variational Inference
Abstract
Variational inference (VI) is widely used for approximate Bayesian inference, but it can scale poorly and often requires re-optimization when new data arrive. Amortized variational inference (AVI) instead learns a global inference map, yet standard mean-field AVI can suffer from a large variational gap, caused by its independence assumptions, and a large amortization gap, caused by replacing per-datapoint optimization with a shared inference network. We propose amortized structural variational inference (ASVI), which injects structural dependencies among latent variables through neural architectures that encode local neighborhood information. ASVI reduces both gaps while retaining scalability. Simulations and real-data experiments show that ASVI improves predictive accuracy and posterior fidelity over AVI and matches structured VI at lower computational cost.