Is Learning Summary Statistics Necessary for Likelihood-Free Inference?

Abstract

Likelihood-free inference (LFI) is a set of techniques for inference in implicit statistical models. A longstanding question in LFI has been how to design or learn good summary statistics of data, but this might now seem unnecessary due to the advent of recent end-to-end (i.e., neural network-based) LFI methods. In this work, we rethink this question with a new method for learning summary statistics. We show that learning sufficient statistics may be easier than direct posterior inference, as the former problem can be reduced to a set of low-dimensional, easy-to-solve learning problems. This motivates us to explicitly decouple summary statistics learning from posterior inference in LFI. Experiments on diverse inference tasks with different data types validate our hypothesis.

Cite

Text

Chen et al. "Is Learning Summary Statistics Necessary for Likelihood-Free Inference?" International Conference on Machine Learning, 2023.

Markdown

[Chen et al. "Is Learning Summary Statistics Necessary for Likelihood-Free Inference?" International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chen2023icml-learning-c/)

BibTeX

@inproceedings{chen2023icml-learning-c,
  title     = {{Is Learning Summary Statistics Necessary for Likelihood-Free Inference?}},
  author    = {Chen, Yanzhi and Gutmann, Michael U. and Weller, Adrian},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {4529--4544},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/chen2023icml-learning-c/}
}