Automatic Posterior Transformation for Likelihood-Free Inference

Abstract

How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting that can limit performance in practice. Here we present automatic posterior transformation (APT), a new sequential neural posterior estimation method for simulation-based inference. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. It is more flexible, scalable and efficient than previous simulation-based inference techniques. APT can operate directly on high-dimensional time series and image data, opening up new applications for likelihood-free inference.
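
As a usage illustration (not taken from the paper itself): APT is implemented under the name SNPE-C in the third-party sbi Python package. The sketch below shows the multi-round loop the abstract refers to, in which each round's posterior estimate becomes the proposal distribution for the next round of simulations. The toy simulator, prior, observation x_o, and all numerical settings are placeholder assumptions, and exact API details may vary across sbi versions.

import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy setup: a uniform prior and a trivially simple stochastic simulator,
# standing in for a real simulator with an intractable likelihood.
prior = BoxUniform(low=-2.0 * torch.ones(3), high=2.0 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

x_o = torch.zeros(3)           # the observed data we condition on
inference = SNPE(prior=prior)  # SNPE-C / APT with a neural density estimator

# Sequential inference: each round's posterior estimate, conditioned on x_o,
# becomes the proposal from which the next round's parameters are drawn.
proposal = prior
for _ in range(2):
    theta = proposal.sample((500,))
    x = simulator(theta)
    density_estimator = inference.append_simulations(theta, x, proposal=proposal).train()
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_o)

samples = posterior.sample((1000,), x=x_o)  # draw from the final posterior estimate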

Cite

Text

Greenberg et al. "Automatic Posterior Transformation for Likelihood-Free Inference." International Conference on Machine Learning, 2019.

Markdown

[Greenberg et al. "Automatic Posterior Transformation for Likelihood-Free Inference." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/greenberg2019icml-automatic/)

BibTeX

@inproceedings{greenberg2019icml-automatic,
  title     = {{Automatic Posterior Transformation for Likelihood-Free Inference}},
  author    = {Greenberg, David and Nonnenmacher, Marcel and Macke, Jakob},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {2404--2414},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/greenberg2019icml-automatic/}
}