Adaptive and Optimal Second-Order Optimistic Methods for Minimax Optimization
Abstract
We propose adaptive, line-search-free second-order methods with an optimal rate of convergence for solving convex-concave min-max problems. By means of an adaptive step size, our algorithms feature a simple update rule that requires solving only one linear system per iteration, eliminating the need for line-search or backtracking mechanisms. Specifically, we base our algorithms on the optimistic method and appropriately combine it with second-order information. Moreover, distinct from common adaptive schemes, we define the step size recursively as a function of the gradient norm and the prediction error in the optimistic update. We first analyze a variant where the step size requires knowledge of the Lipschitz constant of the Hessian. Under the additional assumption of Lipschitz continuous gradients, we further design a parameter-free version by tracking the Hessian Lipschitz constant locally and ensuring the iterates remain bounded. We also evaluate the practical performance of our algorithms by comparing them to existing second-order algorithms for minimax optimization.
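To make the mechanics concrete, the following is a minimal Python sketch of a second-order optimistic-style update in the spirit described above: each iteration solves a single regularized linear system, and the step size is set without line search from the operator norm and the prediction error of the previous linearized model. The specific update and step-size formulas below are illustrative placeholders, not the rule analyzed in the paper; `F` denotes the saddle-point operator (∇_x f, -∇_y f) of a convex-concave f and `JF` its Jacobian.

```python
import numpy as np

def optimistic_second_order_sketch(F, JF, z0, L2=1.0, n_iters=50, eps=1e-12):
    """Illustrative sketch of a second-order optimistic update (NOT the
    paper's exact algorithm): one regularized linear system per iteration,
    with a line-search-free step size driven by the operator norm and the
    prediction error of the previous linearized (optimistic) model."""
    z_prev = np.asarray(z0, dtype=float).copy()
    z = z_prev.copy()
    F_prev, J_prev = F(z_prev), JF(z_prev)
    # Placeholder step-size initialization, not the rule from the paper.
    eta = 1.0 / np.sqrt(L2 * np.linalg.norm(F_prev) + eps)
    for _ in range(n_iters):
        F_cur, J_cur = F(z), JF(z)
        # Prediction error: deviation of F(z) from the linearized model
        # built at the previous iterate z_prev.
        pred_err = F_cur - (F_prev + J_prev @ (z - z_prev))
        # One linear system per iteration (Hessian-regularized step).
        A = np.eye(z.size) / eta + J_cur
        d = np.linalg.solve(A, -(F_cur + pred_err))
        z_prev, F_prev, J_prev = z, F_cur, J_cur
        z = z + d
        # Line-search-free step size from the gradient norm and the
        # prediction error (placeholder formula for illustration only).
        eta = 1.0 / np.sqrt(L2 * max(np.linalg.norm(F_cur),
                                     np.linalg.norm(pred_err)) + eps)
    return z

# Toy usage: bilinear saddle f(x, y) = x * y, whose operator is
# F(x, y) = (y, -x) with the unique saddle point at the origin.
F = lambda z: np.array([z[1], -z[0]])
JF = lambda z: np.array([[0.0, 1.0], [-1.0, 0.0]])
print(optimistic_second_order_sketch(F, JF, np.array([1.0, 1.0])))
```

In this sketch the optimistic flavor enters through the prediction-error correction, and the adaptive step size shrinks when either the operator norm or the prediction error is large, mirroring (but not reproducing) the recursive step-size idea described in the abstract.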
Cite
Text
Jiang et al. "Adaptive and Optimal Second-Order Optimistic Methods for Minimax Optimization." Neural Information Processing Systems, 2024. doi:10.52202/079017-2986
Markdown
[Jiang et al. "Adaptive and Optimal Second-Order Optimistic Methods for Minimax Optimization." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/jiang2024neurips-adaptive-a/) doi:10.52202/079017-2986
BibTeX
@inproceedings{jiang2024neurips-adaptive-a,
  title     = {{Adaptive and Optimal Second-Order Optimistic Methods for Minimax Optimization}},
  author    = {Jiang, Ruichen and Kavis, Ali and Jin, Qiujiang and Sanghavi, Sujay and Mokhtari, Aryan},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2986},
  url       = {https://mlanthology.org/neurips/2024/jiang2024neurips-adaptive-a/}
}