SHPOS: A Theoretical Guaranteed Accelerated Particle Optimization Sampling Method

Abstract

Recently, the Stochastic Particle Optimization Sampling (SPOS) method was proposed to address the particle-collapsing pitfall of deterministic Particle Variational Inference methods by utilizing stochastic overdamped Langevin dynamics to enhance exploration. In this paper, we propose an accelerated particle optimization sampling method called Stochastic Hamiltonian Particle Optimization Sampling (SHPOS). Compared to the first-order dynamics used in SPOS, SHPOS adopts augmented second-order dynamics, which introduce an extra momentum term to achieve acceleration. We establish a non-asymptotic convergence analysis for SHPOS and show that it enjoys a faster convergence rate than SPOS. We also propose a variance-reduced stochastic gradient variant of SHPOS for tasks with large-scale datasets and complex models. Experiments on both synthetic and real data validate our theory and demonstrate the superiority of SHPOS over the state of the art.
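To illustrate the second-order dynamics the abstract refers to, here is a minimal sketch of a momentum-augmented (underdamped) Langevin update, as opposed to the first-order overdamped dynamics used in SPOS. This is not the authors' algorithm; the function name, discretization (Euler-Maruyama), and parameters are illustrative assumptions.

```python
import numpy as np

def second_order_langevin_step(theta, v, grad_u, step=0.01, gamma=1.0, rng=None):
    """One Euler-Maruyama step of second-order (underdamped) Langevin
    dynamics. The momentum variable v is the extra term that
    distinguishes this from first-order overdamped dynamics."""
    rng = rng or np.random.default_rng()
    noise = rng.standard_normal(v.shape)
    # Momentum update: friction -gamma*v, force -grad U(theta), plus noise.
    v = v - step * (gamma * v + grad_u(theta)) + np.sqrt(2.0 * gamma * step) * noise
    # Position update driven by the momentum.
    theta = theta + step * v
    return theta, v

# Example: particles sampling a standard Gaussian, U(theta) = ||theta||^2 / 2,
# so grad U(theta) = theta.
rng = np.random.default_rng(0)
particles = rng.standard_normal((50, 2))   # 50 particles in 2-D
momenta = np.zeros_like(particles)
for _ in range(1000):
    particles, momenta = second_order_langevin_step(
        particles, momenta, grad_u=lambda t: t, rng=rng)
```

In SHPOS, each particle would additionally feel an interaction term coupling it to the other particles; the sketch above shows only the shared second-order Langevin backbone.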

Cite

Text

Li et al. "SHPOS: A Theoretical Guaranteed Accelerated Particle Optimization Sampling Method." International Joint Conference on Artificial Intelligence, 2021. doi:10.24963/IJCAI.2021/372

Markdown

[Li et al. "SHPOS: A Theoretical Guaranteed Accelerated Particle Optimization Sampling Method." International Joint Conference on Artificial Intelligence, 2021.](https://mlanthology.org/ijcai/2021/li2021ijcai-shpos/) doi:10.24963/IJCAI.2021/372

BibTeX

@inproceedings{li2021ijcai-shpos,
  title     = {{SHPOS: A Theoretical Guaranteed Accelerated Particle Optimization Sampling Method}},
  author    = {Li, Zhijian and Zhang, Chao and Qian, Hui and Du, Xin and Peng, Lingwei},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {2701-2707},
  doi       = {10.24963/IJCAI.2021/372},
  url       = {https://mlanthology.org/ijcai/2021/li2021ijcai-shpos/}
}