Sketch-and-Project Meets Newton Method: Global $\mathcal O \left( K^{-2} \right)$ Convergence with Low-Rank Updates

Abstract

In this paper, we propose the first sketch-and-project Newton method with a fast $\mathcal O \left( k^{-2} \right)$ global convergence rate while using low-rank updates. Our method, SGN, can be viewed in three ways: i) as a sketch-and-project algorithm projecting updates of the Newton method, ii) as a cubically regularized Newton method in sketched subspaces, and iii) as a damped Newton method in sketched subspaces. SGN inherits the best of all three worlds: the cheap iteration cost of sketch-and-project methods (up to $\mathcal O(1)$), the state-of-the-art $\mathcal O \left( k^{-2} \right)$ global convergence rate of full-rank Newton-like methods, and the algorithmic simplicity of damped Newton methods. Finally, we demonstrate empirical performance comparable to baseline algorithms.
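To make the three views concrete, below is a minimal illustrative sketch (not the paper's exact SGN algorithm) of one sketched, regularized Newton step: a random Gaussian sketch restricts the Newton system to a low-dimensional subspace, and a cubic-style regularizer $\lambda \propto \sqrt{L \,\|S^\top \nabla f(x)\|}$ damps the step. The Gaussian sketch, the Hessian-Lipschitz estimate `L_est`, and the exact damping rule are assumptions made for illustration.

```python
import numpy as np

def sketched_newton_step(x, grad, hess, tau, L_est, rng):
    """One illustrative sketched Newton step.

    Samples a Gaussian sketch S (d x tau), solves the Newton system
    restricted to the sketched subspace, and damps the step with a
    cubic-style regularizer. L_est and the damping rule are
    illustrative assumptions, not the paper's exact constants.
    """
    d = x.size
    S = rng.standard_normal((d, tau))            # random sketch matrix
    g_s = S.T @ grad(x)                          # sketched gradient (tau,)
    H_s = S.T @ hess(x) @ S                      # sketched Hessian (tau x tau)
    lam = np.sqrt(L_est * np.linalg.norm(g_s))   # cubic-style regularization
    step = np.linalg.solve(H_s + lam * np.eye(tau), g_s)
    return x - S @ step                          # low-rank (rank-tau) update

# Usage: minimize a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# where each iteration only solves a small tau x tau linear system.
rng = np.random.default_rng(0)
d = 50
A = np.diag(np.linspace(1.0, 10.0, d))
b = np.ones(d)
grad = lambda x: A @ x - b
hess = lambda x: A
x = np.zeros(d)
for _ in range(500):
    x = sketched_newton_step(x, grad, hess, tau=5, L_est=1.0, rng=rng)
x_star = np.linalg.solve(A, b)
```

Each iteration costs only a $\tau \times \tau$ solve plus sketching, which is where the cheap per-iteration cost of sketch-and-project methods comes from.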

Cite

Text

Hanzely. "Sketch-and-Project Meets Newton Method: Global $\mathcal O \left( K^{-2} \right)$ Convergence with Low-Rank Updates." ICML 2023 Workshops: FL, 2023.

Markdown

[Hanzely. "Sketch-and-Project Meets Newton Method: Global $\mathcal O \left( K^{-2} \right)$ Convergence with Low-Rank Updates." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/hanzely2023icmlw-sketchandproject/)

BibTeX

@inproceedings{hanzely2023icmlw-sketchandproject,
  title     = {{Sketch-and-Project Meets Newton Method: \\ Global $\mathcal O \left( K^{-2} \right)$ Convergence with Low-Rank Updates}},
  author    = {Hanzely, Slavomir},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/hanzely2023icmlw-sketchandproject/}
}