Poster
An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
Lesi Chen · Haishan Ye · Luo Luo
MR1 & MR2 - Number 178
Abstract:
This paper studies stochastic nonconvex-strongly-concave minimax optimization over a multi-agent network. We propose an efficient algorithm, called Decentralized Recursive gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantee for finding an ε-stationary point. Concretely, it requires O(min(κ³ε⁻³, κ²√N ε⁻²)) stochastic first-order oracle (SFO) calls and Õ(κ²ε⁻²) communication rounds, where κ is the condition number and N is the total number of individual functions. Our numerical experiments also validate the superiority of DREAM over previous methods.
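To make the problem setting concrete, the sketch below runs plain stochastic gradient descent ascent (not DREAM itself, which adds recursive variance reduction and decentralized communication) on a hypothetical toy nonconvex-strongly-concave objective; the objective, step sizes, and noise level are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical toy objective (not from the paper):
#   f(x, y) = sin(x) + x*y - 0.5*y^2
# It is nonconvex in x and strongly concave in y, so
# max_y f(x, y) is attained at y* = x.
def grad_x(x, y):
    return np.cos(x) + y

def grad_y(x, y):
    return x - y

rng = np.random.default_rng(0)
x, y = 2.0, 0.0
eta_x, eta_y = 0.01, 0.1  # illustrative step sizes

for _ in range(5000):
    # Each noisy gradient evaluation mimics one SFO call.
    gx = grad_x(x, y) + 0.1 * rng.standard_normal()
    gy = grad_y(x, y) + 0.1 * rng.standard_normal()
    x -= eta_x * gx  # descent on the minimization variable
    y += eta_y * gy  # ascent on the maximization variable
```

Here an ε-stationary point means the gradient of Φ(x) = max_y f(x, y) has norm at most ε; for this toy objective Φ'(x) = cos(x) + x, which the iterates drive toward zero.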