Solving Convex-Concave Problems with $\mathcal{O}(\epsilon^{-4/7})$ Second-Order Oracle Complexity
Abstract
Previous algorithms can solve convex-concave minimax problems $\min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} f(x,y)$ with $\mathcal{O}(\epsilon^{-2/3})$ second-order oracle calls using Newton-type methods. This result has been speculated to be optimal because the upper bound is achieved by a natural generalization of the optimal first-order method. In this work, we show an improved upper bound of $\tilde{\mathcal{O}}(\epsilon^{-4/7})$ by generalizing the optimal second-order method for convex optimization to solve the convex-concave minimax problem. We further apply a similar technique to lazy Hessian algorithms and show that our proposed algorithm can also be seen as a second-order “Catalyst” framework (Lin et al., JMLR 2018) that could accelerate any globally convergent algorithm for solving minimax problems.
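(Note: the abstract does not spell out the accuracy criterion behind these oracle-complexity bounds; a minimal sketch, assuming the standard duality-gap notion commonly used for convex-concave problems on compact domains, is that an approximate solution $(\hat{x}, \hat{y})$ must satisfy

$$\max_{y \in \mathcal{Y}} f(\hat{x}, y) \;-\; \min_{x \in \mathcal{X}} f(x, \hat{y}) \;\le\; \epsilon,$$

so an $\mathcal{O}(\epsilon^{-4/7})$ second-order oracle complexity would mean this gap is driven below $\epsilon$ within that many gradient/Hessian evaluations.)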
Cite
Text

Chen et al. "Solving Convex-Concave Problems with $\mathcal{O}(\epsilon^{-4/7})$ Second-Order Oracle Complexity." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.

Markdown

[Chen et al. "Solving Convex-Concave Problems with $\mathcal{O}(\epsilon^{-4/7})$ Second-Order Oracle Complexity." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.](https://mlanthology.org/colt/2025/chen2025colt-solving/)

BibTeX
@inproceedings{chen2025colt-solving,
title = {{Solving Convex-Concave Problems with $\mathcal{O}(\epsilon^{-4/7})$ Second-Order Oracle Complexity}},
author = {Chen, Lesi and Liu, Chengchang and Luo, Luo and Zhang, Jingzhao},
booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
year = {2025},
pages = {952--982},
volume = {291},
url = {https://mlanthology.org/colt/2025/chen2025colt-solving/}
}