

The Lie-Group Bayesian Learning Rule

Eren Mehmet Kiral · Thomas Moellenhoff · Emtiyaz Khan

Auditorium 1 Foyer 122


The Bayesian Learning Rule provides a framework for generic algorithm design but can be difficult to use for three reasons. First, it requires a specific parameterization of the exponential family. Second, it uses gradients which can be difficult to compute. Third, its update may not always stay on the distribution's manifold. We address these difficulties by proposing an extension based on Lie groups, where posteriors are parametrized through transformations of an arbitrary base distribution and updated via the group's exponential map. This alleviates all three difficulties in many cases, providing flexible parametrizations through the group's action, simple gradient computation through reparameterization, and updates that always stay on the manifold. We use the new learning rule to derive a new algorithm for deep learning with desirable biologically-plausible attributes to learn sparse features.
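To make the three ingredients concrete, here is a minimal toy sketch (not the paper's exact algorithm) for a 1-D Gaussian posterior parametrized by the scale-translation group acting on a standard-normal base: samples are z = m + s·eps, gradients come from the reparameterization trick, the translation part m is updated additively, and the scale s is updated through a multiplicative exponential-map step, so it can never leave the manifold s > 0. The target loss E_q[(z − 2)²] and all step sizes are illustrative choices.

```python
import math
import random

random.seed(0)

def grad_estimate(m, s, n=256):
    """Monte Carlo reparameterization gradients of E_q[(z - 2)^2],
    where z = m + s * eps and eps ~ N(0, 1)."""
    gm = gs = 0.0
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)
        z = m + s * eps
        gl = 2.0 * (z - 2.0)   # d f / d z for f(z) = (z - 2)^2
        gm += gl               # chain rule: dz/dm = 1
        gs += gl * eps         # chain rule: dz/ds = eps
    return gm / n, gs / n

m, s, lr = 0.0, 1.0, 0.05
for _ in range(500):
    gm, gs = grad_estimate(m, s)
    m = m - lr * gm                 # additive update on the translation part
    s = s * math.exp(-lr * s * gs)  # exponential-map update keeps s > 0
```

After the loop, m has moved close to the loss minimum at 2 and s has contracted toward 0 while staying strictly positive; an ordinary additive update on s would offer no such guarantee.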
