Stochastic Extragradient with Flip-Flop Shuffling & Anchoring: Provable Improvements

Abstract

In minimax optimization, the extragradient (EG) method has been extensively studied because it outperforms the gradient descent-ascent method in convex-concave (C-C) problems. Yet, stochastic EG (SEG) has seen limited success in C-C problems, especially for unconstrained cases. Motivated by the recent progress of shuffling-based stochastic methods, we investigate the convergence of shuffling-based SEG in unconstrained finite-sum minimax problems, in search of convergent shuffling-based SEG. Our analysis reveals that both random reshuffling and the recently proposed flip-flop shuffling alone can suffer divergence in C-C problems. However, with an additional simple trick called anchoring, we develop the SEG with flip-flop anchoring (SEG-FFA) method which successfully converges in C-C problems. We also show upper and lower bounds in the strongly-convex-strongly-concave setting, demonstrating that SEG-FFA has a provably faster convergence rate compared to other shuffling-based methods.
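
To make the epoch structure concrete, below is a minimal NumPy sketch of the idea described in the abstract: within each epoch, stochastic extragradient steps are taken over a random permutation of the components and then over the same permutation in reverse (flip-flop shuffling), after which an anchoring step pulls the iterate back toward the epoch's starting point. The bilinear test problem, the step sizes, and the exact anchoring coefficient (a simple midpoint) are illustrative assumptions, not the paper's precise algorithmic constants.

import numpy as np

# Sketch of SEG with flip-flop shuffling and anchoring on a finite-sum
# bilinear saddle problem  min_x max_y (1/n) * sum_i x^T A_i y.
# The component saddle-gradient operator is F_i(x, y) = (A_i y, -A_i^T x).
rng = np.random.default_rng(0)
n, d = 8, 4
A = rng.standard_normal((n, d, d))  # assumed toy data, not from the paper

def F(i, x, y):
    """Saddle-gradient of the i-th component (ascent direction on y negated)."""
    return A[i] @ y, -A[i].T @ x

def seg_step(i, x, y, alpha, beta):
    """One stochastic extragradient step using component i."""
    gx, gy = F(i, x, y)
    x_half, y_half = x - alpha * gx, y - alpha * gy   # extrapolation
    gx, gy = F(i, x_half, y_half)
    return x - beta * gx, y - beta * gy               # update

def seg_ffa_epoch(x, y, alpha, beta):
    """One epoch: forward pass, reversed (flip-flop) pass, then anchoring."""
    x0, y0 = x, y
    perm = rng.permutation(n)
    for i in np.concatenate([perm, perm[::-1]]):      # flip-flop shuffling
        x, y = seg_step(i, x, y, alpha, beta)
    # Anchoring (assumed midpoint form): pull the epoch endpoint back
    # toward the epoch's starting point.
    return (x0 + x) / 2, (y0 + y) / 2

x, y = rng.standard_normal(d), rng.standard_normal(d)
alpha, beta = 0.05, 0.05                              # illustrative step sizes
for epoch in range(200):
    x, y = seg_ffa_epoch(x, y, alpha, beta)
print("distance to the saddle point (0, 0):",
      np.hypot(np.linalg.norm(x), np.linalg.norm(y)))

Running the sketch should show the iterate approaching the saddle point (0, 0) of the averaged bilinear problem, whereas dropping the anchoring line and using only shuffled SEG steps can be unstable on such problems, in line with the abstract's claim.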

Cite

Text

Chae et al. "Stochastic Extragradient with Flip-Flop Shuffling & Anchoring: Provable Improvements." Neural Information Processing Systems, 2024. doi:10.52202/079017-3932

Markdown

[Chae et al. "Stochastic Extragradient with Flip-Flop Shuffling & Anchoring: Provable Improvements." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/chae2024neurips-stochastic/) doi:10.52202/079017-3932

BibTeX

@inproceedings{chae2024neurips-stochastic,
  title     = {{Stochastic Extragradient with Flip-Flop Shuffling \& Anchoring: Provable Improvements}},
  author    = {Chae, Jiseok and Yun, Chulhee and Kim, Donghwan},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-3932},
  url       = {https://mlanthology.org/neurips/2024/chae2024neurips-stochastic/}
}