

Poster

Riemannian Accelerated Gradient Methods via Extrapolation

Andi Han · Bamdev Mishra · Pratik Jawanpuria · Junbin Gao

Auditorium 1 Foyer 123

Abstract:

In this paper, we propose a convergence acceleration scheme for general Riemannian optimization problems by extrapolating iterates on manifolds. We show that when the iterates are generated by the Riemannian gradient descent method, the scheme asymptotically achieves the optimal convergence rate and is computationally more favorable than the recently proposed Riemannian Nesterov accelerated gradient methods. A salient feature of our analysis is that the convergence guarantees hold under general retractions and vector transports. Empirically, we verify the practical benefits of the proposed acceleration strategy, including robustness to the choice of different averaging schemes on manifolds.
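To make the idea concrete, the following Python sketch illustrates one way iterate extrapolation can be realized on the unit sphere. This is not the authors' implementation: the projection retraction, the sphere logarithm used as an inverse retraction, the RNA-style regularized least-squares weights (with a hypothetical regularizer lam), and the leading-eigenvector test problem are all assumptions made for illustration; the paper's actual scheme and averaging choices are specified there.

# A minimal sketch, assuming the projection retraction on the sphere and the
# sphere log map as an inverse retraction; weights are RNA-style and the test
# problem is illustrative only (NOT the authors' code).
import numpy as np

def retract(x, v):
    # Projection retraction on the sphere: R_x(v) = (x + v) / ||x + v||.
    y = x + v
    return y / np.linalg.norm(y)

def log_map(x, y):
    # Sphere logarithm: tangent vector at x pointing toward y.
    c = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    w = y - c * x
    return theta * w / np.linalg.norm(w)

def exp_map(x, v):
    # Sphere exponential: follow the geodesic from x along v.
    t = np.linalg.norm(v)
    if t < 1e-12:
        return x.copy()
    return np.cos(t) * x + np.sin(t) * v / t

def rgd(x0, egrad, step, iters):
    # Riemannian gradient descent with the projection retraction.
    xs, x = [x0], x0
    for _ in range(iters):
        g = egrad(x)
        rg = g - (x @ g) * x          # project onto the tangent space at x
        x = retract(x, -step * rg)
        xs.append(x)
    return xs

def extrapolate(xs, lam=1e-8):
    # Extrapolation of iterates lifted to a single tangent space.
    base = xs[-1]
    U = np.stack([log_map(base, x) for x in xs])   # lifted iterates
    D = U[1:] - U[:-1]                             # tangent-space residuals
    k = D.shape[0]
    # Weights c minimizing the residual norm subject to sum(c) = 1,
    # with Tikhonov regularization lam (hypothetical choice).
    M = D @ D.T + lam * np.eye(k)
    c = np.linalg.solve(M, np.ones(k))
    c /= c.sum()
    v = c @ U[1:]                                  # weighted tangent average
    return exp_map(base, v)

# Illustrative use: minimize f(x) = -x' A x on the sphere (top eigenvector).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)); A = (A + A.T) / 2
x0 = rng.standard_normal(50); x0 /= np.linalg.norm(x0)
xs = rgd(x0, lambda x: -2 * A @ x, step=0.05, iters=20)
x_acc = extrapolate(xs[-6:])
print("f(last RGD):", -xs[-1] @ A @ xs[-1], " f(extrapolated):", -x_acc @ A @ x_acc)

Lifting the iterates into one tangent space before averaging is one natural way to extrapolate on a manifold using only a retraction and an (approximate) inverse, and the small k-by-k solve keeps the per-extrapolation cost negligible next to the gradient steps, consistent with the abstract's claim of computational favorability over Riemannian Nesterov methods.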
