Poster

Mirrorless Mirror Descent: A Natural Derivation of Mirror Descent

Suriya Gunasekar · Blake Woodworth · Nathan Srebro

Keywords: [ Learning Theory and Statistics ] [ Gradient-Based Optimization ]


Abstract: We present a direct (primal only) derivation of Mirror Descent as a ``partial'' discretization of gradient flow on a Riemannian manifold where the metric tensor is the Hessian of the Mirror Descent potential function. We contrast this discretization to Natural Gradient Descent, which is obtained by a ``full'' forward Euler discretization. This view helps shed light on the relationship between the methods and allows generalizing Mirror Descent to any Riemannian geometry in $\mathbb{R}^d$, even when the metric tensor is {\em not} a Hessian, and thus there is no ``dual.''
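The contrast described in the abstract can be illustrated with a minimal sketch (not from the paper): both methods discretize the Riemannian gradient flow $\dot{x} = -H_\psi(x)^{-1}\nabla f(x)$, where $H_\psi$ is the Hessian of the potential $\psi$. Mirror Descent takes a forward Euler step in the dual (mirror) space, while Natural Gradient Descent takes a forward Euler step directly in the primal space. The negative-entropy potential, the toy quadratic objective, the step size, and all helper names below are illustrative assumptions, not taken from the paper.

```python
# Sketch contrasting Mirror Descent ("partial" discretization) with
# Natural Gradient Descent ("full" forward Euler discretization) of the
# Riemannian gradient flow  dx/dt = -H_psi(x)^{-1} grad f(x).
# Assumption: psi(x) = sum_i x_i log x_i (negative entropy) on the positive
# orthant, so H_psi(x) = diag(1/x); f is a toy quadratic objective.

import numpy as np

TARGET = np.array([0.2, 0.5, 0.3])

def grad_f(x):
    """Gradient of the toy objective f(x) = 0.5 * ||x - TARGET||^2."""
    return x - TARGET

def grad_psi(x):
    """Mirror map for the negative-entropy potential."""
    return 1.0 + np.log(x)

def grad_psi_inv(theta):
    """Inverse mirror map: recovers x from the dual variable theta."""
    return np.exp(theta - 1.0)

def inv_metric(x):
    """Inverse Hessian of psi, i.e. the inverse metric tensor diag(x)."""
    return np.diag(x)

def mirror_descent_step(x, eta):
    """Forward Euler step taken in the dual (mirror) space."""
    return grad_psi_inv(grad_psi(x) - eta * grad_f(x))

def natural_gradient_step(x, eta):
    """Forward Euler step taken directly in the primal space."""
    return x - eta * inv_metric(x) @ grad_f(x)

x_md = x_ngd = np.ones(3)
for _ in range(200):
    x_md = mirror_descent_step(x_md, eta=0.1)
    x_ngd = natural_gradient_step(x_ngd, eta=0.1)

print("mirror descent:  ", np.round(x_md, 3))
print("natural gradient:", np.round(x_ngd, 3))
```

With this choice of potential, the Mirror Descent step reduces to the multiplicative update $x \leftarrow x \odot \exp(-\eta \nabla f(x))$, while the Natural Gradient step scales the gradient coordinatewise by $x$; both approximate the same underlying gradient flow.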
