Poster
Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties
David Martínez-Rubio · Markus Frey · Shpresim Sadiku
Abstract:
In this work, we study optimization problems of the form $\min_{x} \max_{y} f(x,y)$, where $f(x,y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in $x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y \ge 0$. We design accelerated methods when $f$ is $(L_x, L_y, L_{xy})$-smooth and $\mathcal{M}$, $\mathcal{N}$ are Hadamard. To that aim, we introduce new g-convex optimization results of independent interest: we show global linear convergence for metric-projected Riemannian gradient descent and improve existing accelerated methods by reducing geometric constants. Additionally, we complete the analysis of two previous works applying to the Riemannian min-max case by removing an assumption about iterates staying in a pre-specified compact set.
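For reference, the setting can be written out as follows; this is the standard formalization of strong geodesic convexity and is not quoted verbatim from the paper. The problem is

$$\min_{x \in \mathcal{M}} \; \max_{y \in \mathcal{N}} \; f(x, y),$$

where, for every fixed $y \in \mathcal{N}$ and every geodesic $\gamma : [0,1] \to \mathcal{M}$,

$$f(\gamma(t), y) \;\le\; (1-t)\, f(\gamma(0), y) + t\, f(\gamma(1), y) - \frac{\mu_x}{2}\, t(1-t)\, d_{\mathcal{M}}\big(\gamma(0), \gamma(1)\big)^2,$$

and, symmetrically, $-f(x, \cdot)$ is $\mu_y$-strongly g-convex on $\mathcal{N}$ for every fixed $x \in \mathcal{M}$. When $\mathcal{M} = \mathcal{N} = \mathbb{R}^n$ with the Euclidean metric, geodesics are straight lines and this reduces to the usual strongly-convex-strongly-concave min-max setting.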