Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency

Abstract

Mutual information (MI) is a central quantity of interest in information theory and machine learning, but estimating it accurately and efficiently remains challenging. In this paper, we propose a novel approach that exploits Bayesian self-consistency to improve the data efficiency of variational MI estimators. Our method incorporates a principled variance penalty that encourages consistency in marginal likelihood estimates, ultimately leading to more accurate MI estimation and posterior approximation with fewer gradient steps. We demonstrate the effectiveness of our method on two tasks: (1) MI estimation for correlated Gaussian distributions; and (2) Bayesian experimental design for the Michaelis-Menten model. Our results show that the self-consistent estimator converges faster whilst producing higher-quality MI and posterior estimates than baselines.
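To make the self-consistency idea concrete, here is a minimal, illustrative sketch (not the authors' code) of a variance penalty of the kind the abstract describes. It relies on the identity that, for the exact posterior, log p(x) = log p(θ, x) − log p(θ | x) is constant in θ, so the empirical variance of this quantity over posterior draws measures inconsistency of an approximate posterior. The callables `log_joint` and `posterior`, the weight `lam`, and the combination with a generic variational MI loss are all hypothetical placeholders.

```python
# Hedged sketch: a Bayesian self-consistency variance penalty, assuming a
# differentiable joint log-density log_joint(theta, x) and an amortized
# variational posterior returning a torch.distributions object with
# rsample/log_prob. Names here are illustrative, not the paper's API.
import torch


def self_consistency_penalty(log_joint, posterior, x, num_theta=16):
    """Variance of implied log marginal-likelihood estimates over theta draws.

    If q(theta | x) were the exact posterior, every draw would give the same
    value log p(x); the variance across draws therefore penalizes
    self-inconsistency of the approximation.
    """
    q = posterior(x)                      # q(theta | x), a Distribution
    theta = q.rsample((num_theta,))       # (num_theta, batch, theta_dim)
    # Implied log marginal likelihood per draw: log p(theta, x) - log q(theta | x)
    log_marginal = log_joint(theta, x) - q.log_prob(theta)  # (num_theta, batch)
    return log_marginal.var(dim=0).mean()


def total_loss(variational_mi_loss, log_joint, posterior, x, lam=1.0):
    # A standard variational MI objective (e.g. a Barber-Agakov-style bound)
    # plus the weighted consistency penalty; lam trades off the two terms.
    return variational_mi_loss + lam * self_consistency_penalty(
        log_joint, posterior, x
    )
```

In this reading, the penalty adds no new data requirements: it reuses the same posterior samples as the variational objective, which is one plausible route to the data efficiency the abstract claims.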

Cite

Text

Ivanova et al. "Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency." NeurIPS 2024 Workshops: BDU, 2024.

Markdown

[Ivanova et al. "Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency." NeurIPS 2024 Workshops: BDU, 2024.](https://mlanthology.org/neuripsw/2024/ivanova2024neuripsw-dataefficient/)

BibTeX

@inproceedings{ivanova2024neuripsw-dataefficient,
  title     = {{Data-Efficient Variational Mutual Information Estimation via Bayesian Self-Consistency}},
  author    = {Ivanova, Desi R. and Schmitt, Marvin and Radev, Stefan T.},
  booktitle = {NeurIPS 2024 Workshops: BDU},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/ivanova2024neuripsw-dataefficient/}
}