Stochastic Methods in Variational Inequalities: Ergodicity, Bias and Refinements

Emmanouil Vasileios Vlatakis-Gkaragkounis · Angeliki Giannou · Yudong Chen · Qiaomin Xie

MR1 & MR2 - Number 135
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT
Oral presentation: Oral: Optimization
Thu 2 May 5 a.m. PDT — 6:15 a.m. PDT


For min-max optimization and variational inequality problems (VIPs), Stochastic Extragradient (SEG) and Stochastic Gradient Descent Ascent (SGDA) have emerged as preeminent algorithms. Constant step-size versions of SEG/SGDA have gained popularity due to several appealing benefits, but their convergence behavior is complicated even in rudimentary bilinear models. Our work elucidates the probabilistic behavior of these algorithms and their projected variants for a wide range of monotone and non-monotone VIPs with potentially biased stochastic oracles. By recasting them as time-homogeneous Markov chains, we establish geometric convergence to a unique invariant distribution and asymptotic normality. Specializing to min-max optimization, we characterize the relationship between the step-size and the induced bias with respect to the global solution, which in turn allows for bias refinement via the Richardson-Romberg scheme. Our theoretical analysis is corroborated by numerical experiments.
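To make the abstract's claim concrete, the following is a minimal, self-contained sketch (not the authors' code) of the phenomenon it describes: constant step-size SGDA on a toy perturbed bilinear game converges in distribution rather than to the exact solution, its time-averaged iterate carries an O(eta) bias, and combining runs at two step-sizes via Richardson-Romberg extrapolation cancels the first-order bias term. The operator F(x, y) = (x + y + c*x**2, y - x), the constants, and the run lengths are all illustrative assumptions chosen so that the bias is visible; they do not come from the paper.

```python
import random

def sgda_average(eta, n_steps, burn_in, c=0.5, sigma=1.0, seed=0):
    """Run constant step-size SGDA on a toy smooth game and return the
    time-average of the iterates after a burn-in period.

    The illustrative operator F(x, y) = (x + y + c*x**2, y - x) is a
    perturbed bilinear saddle operator with unique root (0, 0); the small
    quadratic term makes the mean of the invariant distribution sit at an
    O(eta) distance from the root, which is the bias we want to expose.
    """
    rng = random.Random(seed)
    x = y = 0.0
    sum_x = sum_y = 0.0
    for k in range(n_steps):
        # Unbiased stochastic oracle: F(x, y) plus additive Gaussian noise.
        gx = x + y + c * x * x + sigma * rng.gauss(0.0, 1.0)
        gy = y - x + sigma * rng.gauss(0.0, 1.0)
        x -= eta * gx
        y -= eta * gy
        if k >= burn_in:  # discard transient, average over the stationary phase
            sum_x += x
            sum_y += y
    m = n_steps - burn_in
    return sum_x / m, sum_y / m

eta = 0.1
xbar_full, _ = sgda_average(eta, 400_000, 50_000, seed=1)       # bias ~ c1*eta
xbar_half, _ = sgda_average(eta / 2, 400_000, 50_000, seed=2)   # bias ~ c1*eta/2
# Richardson-Romberg extrapolation: 2*avg(eta/2) - avg(eta) cancels the
# O(eta) term of the bias, leaving an O(eta^2) residual plus sampling noise.
x_rr = 2.0 * xbar_half - xbar_full
print(xbar_full, xbar_half, x_rr)
```

Under these assumptions the averaged iterate at step-size eta sits a small but systematic distance below the root, the eta/2 run roughly halves that offset, and the extrapolated estimate lands much closer to zero, mirroring the step-size/bias relationship and the refinement described above.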