Improving Gradient-Guided Nested Sampling for Posterior Inference

Abstract

We present a performant, general-purpose gradient-guided nested sampling (GGNS) algorithm, combining the state of the art in differentiable programming, Hamiltonian slice sampling, clustering, mode separation, dynamic nested sampling, and parallelization. This unique combination allows GGNS to scale well with dimensionality and perform competitively on a variety of synthetic and real-world problems. We also show the potential of combining nested sampling with generative flow networks to obtain large numbers of high-quality samples from the posterior distribution. This combination leads to faster mode discovery and more accurate estimates of the partition function.
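
For orientation, below is a minimal NumPy sketch of the vanilla nested sampling evidence (partition function) estimate that GGNS builds on. It is illustrative only, not the authors' implementation: it replaces the paper's gradient-guided Hamiltonian slice sampling with naive rejection sampling from the prior, and the toy Gaussian likelihood, prior bounds, and all parameter values are assumptions chosen for the example.

import numpy as np

rng = np.random.default_rng(0)
dim, n_live, n_iter = 2, 100, 800
lo_b, hi_b = -5.0, 5.0  # uniform prior on [-5, 5]^2 (assumed for this toy)

def log_likelihood(x):
    # Standard 2D Gaussian likelihood, unnormalized: exp(-|x|^2 / 2).
    return -0.5 * np.sum(np.square(x), axis=-1)

# Initial live points drawn from the prior.
live = rng.uniform(lo_b, hi_b, size=(n_live, dim))
live_logL = log_likelihood(live)

log_Z = -np.inf  # running log-evidence (log partition function)
log_X = 0.0      # log prior-volume fraction inside the current contour

for _ in range(n_iter):
    worst = int(np.argmin(live_logL))
    logL_star = live_logL[worst]

    # Expected shrinkage per iteration: E[log X_k] = log X_{k-1} - 1/N,
    # so the discarded point carries weight w_k = X_{k-1} - X_k.
    log_w = log_X + np.log1p(-np.exp(-1.0 / n_live))
    log_Z = np.logaddexp(log_Z, logL_star + log_w)
    log_X -= 1.0 / n_live

    # Replace the worst point with a prior draw above the likelihood
    # contour. Naive rejection here; GGNS instead uses gradient-guided
    # slice moves, which is what lets it scale with dimensionality.
    while True:
        cand = rng.uniform(lo_b, hi_b, size=dim)
        if log_likelihood(cand) > logL_star:
            break
    live[worst] = cand
    live_logL[worst] = log_likelihood(cand)

# Termination: spread the remaining prior volume over the live points.
log_Z = np.logaddexp(
    log_Z, log_X + np.logaddexp.reduce(live_logL) - np.log(n_live)
)
print(f"log Z estimate: {log_Z:.3f} "
      f"(analytic: log(2*pi/100) = {np.log(2 * np.pi / 100):.3f})")

The estimate should land near the analytic value up to the usual O(sqrt(H / n_live)) statistical scatter in log Z; the rejection step is the bottleneck that the paper's gradient-guided proposals are designed to remove.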

Cite

Text

Lemos et al. "Improving Gradient-Guided Nested Sampling for Posterior Inference." International Conference on Machine Learning, 2024.

Markdown

[Lemos et al. "Improving Gradient-Guided Nested Sampling for Posterior Inference." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/lemos2024icml-improving/)

BibTeX

@inproceedings{lemos2024icml-improving,
  title     = {{Improving Gradient-Guided Nested Sampling for Posterior Inference}},
  author    = {Lemos, Pablo and Malkin, Nikolay and Handley, Will and Bengio, Yoshua and Hezaveh, Yashar and Perreault-Levasseur, Laurence},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {27230--27253},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/lemos2024icml-improving/}
}