Improving Convergence Guarantees of Random Subspace Second-Order Algorithm for Nonconvex Optimization

Abstract

In recent years, random subspace methods have been actively studied for high-dimensional nonconvex problems. Recent subspace methods have improved theoretical guarantees, such as iteration complexity and local convergence rate, while reducing computational costs by deriving descent directions in randomly selected low-dimensional subspaces. This paper proposes the Random Subspace Homogenized Trust Region (RSHTR) method, which has the best theoretical guarantees among random subspace algorithms for nonconvex optimization. RSHTR achieves an $\varepsilon$-approximate first-order stationary point in $O(\varepsilon^{-3/2})$ iterations and converges locally at a linear rate. Furthermore, under rank-deficient conditions, RSHTR satisfies $\varepsilon$-approximate second-order necessary conditions in $O(\varepsilon^{-3/2})$ iterations and exhibits local quadratic convergence. Experiments on real-world datasets verify the benefits of RSHTR.

Cite

Text

Higuchi et al. "Improving Convergence Guarantees of Random Subspace Second-Order Algorithm for Nonconvex Optimization." International Conference on Learning Representations, 2025.

Markdown

[Higuchi et al. "Improving Convergence Guarantees of Random Subspace Second-Order Algorithm for Nonconvex Optimization." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/higuchi2025iclr-improving/)

BibTeX

@inproceedings{higuchi2025iclr-improving,
  title     = {{Improving Convergence Guarantees of Random Subspace Second-Order Algorithm for Nonconvex Optimization}},
  author    = {Higuchi, Rei and Poirion, Pierre-Louis and Takeda, Akiko},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/higuchi2025iclr-improving/}
}