Accelerated Stochastic Optimization Methods Under Quasar-Convexity

Abstract

Non-convex optimization plays a key role in a growing number of machine learning applications. This motivates the identification of specialized structure that enables sharper theoretical analysis. One such structure is quasar-convexity, a non-convex generalization of convexity that subsumes convex functions. Existing algorithms for minimizing quasar-convex functions in the stochastic setting have either high complexity or slow convergence, which prompts us to derive a new class of stochastic methods for optimizing smooth quasar-convex functions. We demonstrate that our algorithms have fast convergence and outperform existing algorithms on several examples, including the classical problem of learning linear dynamical systems. We also present a unified analysis of our newly proposed algorithms and a previously studied deterministic algorithm.
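
For context, a standard definition of quasar-convexity (following the formulation of Hinder, Sidford, and Sohoni, 2020; the parameter \gamma and the minimizer x^* below are their notation, not taken from this page) is that a differentiable function f is \gamma-quasar-convex with respect to a minimizer x^*, for some \gamma \in (0, 1], if

    f(x^*) \;\ge\; f(x) + \frac{1}{\gamma}\,\langle \nabla f(x),\, x^* - x \rangle \qquad \text{for all } x.

Every convex function satisfies this with \gamma = 1 at any of its minimizers, by the first-order characterization of convexity, which is the sense in which quasar-convexity subsumes convex functions; for \gamma < 1 the class also contains non-convex functions.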

Cite

Text

Fu et al. "Accelerated Stochastic Optimization Methods Under Quasar-Convexity." International Conference on Machine Learning, 2023.

Markdown

[Fu et al. "Accelerated Stochastic Optimization Methods Under Quasar-Convexity." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/fu2023icml-accelerated/)

BibTeX

@inproceedings{fu2023icml-accelerated,
  title     = {{Accelerated Stochastic Optimization Methods Under Quasar-Convexity}},
  author    = {Fu, Qiang and Xu, Dongchu and Wilson, Ashia Camage},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {10431--10460},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/fu2023icml-accelerated/}
}