Optimal and Adaptive Monteiro-Svaiter Acceleration

Abstract

We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $p\ge 2$ we improve the complexity of convex optimization with Lipschitz $p$th derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression problems our method outperforms previous accelerated second-order methods, but underperforms Newton's method; simply iterating our first-order adaptive subproblem solver yields a method that is competitive with L-BFGS.
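To make the "implicit equation" concrete, below is a minimal sketch (in Python) of the classical second-order MS outer loop that the paper builds on, not the paper's adaptive algorithm itself. In classical MS, the step size lam and the extrapolation point y depend on each other, so each iteration resolves the coupling by bisection; that inner search is the source of the logarithmic factor the paper removes. All names and constants here (f, grad, hess, L2, sigma, the logistic-regression instance) are illustrative assumptions, not the authors' implementation.

import numpy as np

def ms_accelerate(f, grad, hess, x0, L2, T=50, sigma=0.5, tol=1e-8):
    """Sketch of classical second-order MS acceleration; lam is found by bisection."""
    x, v, A = x0.copy(), x0.copy(), 0.0

    def subproblem(y, lam):
        # One Newton step on min_x f(x) + ||x - y||^2 / (2 lam),
        # i.e. x = y - (I + lam * hess(y))^{-1} (lam * grad(y)).
        H = np.eye(len(y)) + lam * hess(y)
        return y - np.linalg.solve(H, lam * grad(y))

    for _ in range(T):
        lo, hi = 1e-12, 1e12   # bracket for the bisection on lam
        for _ in range(200):   # the implicit-equation solve classical MS requires
            lam = np.sqrt(lo * hi)
            # The momentum weight a and extrapolation point y both depend on lam,
            # while the acceptable lam depends on the step taken from y: the coupling.
            a = (lam + np.sqrt(lam ** 2 + 4.0 * lam * A)) / 2.0
            y = (A * x + a * v) / (A + a)
            x_new = subproblem(y, lam)
            r = np.linalg.norm(x_new - y)
            # MS acceptance condition (up to constant factors):
            # lam * L2 * ||x_new - y|| should sit in a fixed interval.
            if lam * L2 * r > sigma:
                hi = lam
            elif lam * L2 * r < sigma / 2.0 and r > tol:
                lo = lam
            else:
                break
        A += a
        x, v = x_new, v - a * grad(x_new)
        if np.linalg.norm(grad(x)) < tol:
            break
    return x

# Hypothetical usage on a tiny logistic-regression instance (labels in {-1, +1}).
rng = np.random.default_rng(0)
Xd = rng.normal(size=(100, 5))
yd = rng.integers(0, 2, 100) * 2.0 - 1.0

def f(w):
    return np.mean(np.logaddexp(0.0, -yd * (Xd @ w)))

def grad(w):
    s = -yd / (1.0 + np.exp(yd * (Xd @ w)))
    return Xd.T @ s / len(yd)

def hess(w):
    p = 1.0 / (1.0 + np.exp(-Xd @ w))
    return Xd.T @ (Xd * (p * (1.0 - p))[:, None]) / len(yd)

w = ms_accelerate(f, grad, hess, np.zeros(5), L2=1.0)  # L2 is an assumed smoothness constant

The inner bisection above is exactly the per-iteration overhead the paper eliminates: its variant updates lam adaptively across iterations instead of re-solving the coupling from scratch, and its parameter-free subproblem solver replaces the exact Newton-type solve used in this sketch.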

Cite

Text

Carmon et al. "Optimal and Adaptive Monteiro-Svaiter Acceleration." Neural Information Processing Systems, 2022.

Markdown

[Carmon et al. "Optimal and Adaptive Monteiro-Svaiter Acceleration." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/carmon2022neurips-optimal/)

BibTeX

@inproceedings{carmon2022neurips-optimal,
  title     = {{Optimal and Adaptive Monteiro-Svaiter Acceleration}},
  author    = {Carmon, Yair and Hausler, Danielle and Jambulapati, Arun and Jin, Yujia and Sidford, Aaron},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/carmon2022neurips-optimal/}
}