Variance Reduction in Black-Box Variational Inference by Adaptive Importance Sampling

Abstract

Overdispersed black-box variational inference employs importance sampling to reduce the variance of the Monte Carlo gradient in black-box variational inference, using a simple overdispersed proposal distribution. This paper investigates how to adaptively obtain a better proposal distribution for lower variance. To this end, we directly approximate the theoretically optimal proposal using a Monte Carlo moment matching step at each variational iteration. We call this adaptive proposal the moment matching proposal (MMP). Experimental results on two Bayesian models show that the MMP can effectively reduce variance in black-box learning, and performs better than baseline inference algorithms.
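The core idea of the moment matching step can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the target density, the Gaussian proposal family, and all parameter values below are illustrative assumptions. At each iteration, samples are drawn from the current proposal, self-normalized importance weights are computed, and the proposal's mean and variance are reset to the weighted moments of the samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized log-density of the target distribution whose
# expectations we want to estimate (a standard normal, for illustration).
# In BBVI the relevant target involves the gradient-weighted objective.
def log_target(x):
    return -0.5 * x**2

# Current Gaussian proposal q(x; mu, sigma), deliberately mismatched.
mu, sigma = 2.0, 1.5

for step in range(20):
    # Draw Monte Carlo samples from the current proposal.
    x = rng.normal(mu, sigma, size=5000)
    # Self-normalized importance weights w ∝ p(x) / q(x),
    # computed in log space for numerical stability.
    log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
    log_w = log_target(x) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Moment matching: set the proposal's parameters to the weighted
    # first and second central moments of the samples.
    mu = float(np.sum(w * x))
    sigma = float(np.sqrt(np.sum(w * (x - mu) ** 2)))

# The proposal's moments should approach those of the target (0, 1).
print(mu, sigma)
```

After a few iterations the proposal closely matches the target's moments, which keeps the importance weights well behaved and the resulting gradient estimates low-variance.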

Cite

Text

Li et al. "Variance Reduction in Black-Box Variational Inference by Adaptive Importance Sampling." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/333

Markdown

[Li et al. "Variance Reduction in Black-Box Variational Inference by Adaptive Importance Sampling." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/li2018ijcai-variance/) doi:10.24963/IJCAI.2018/333

BibTeX

@inproceedings{li2018ijcai-variance,
  title     = {{Variance Reduction in Black-Box Variational Inference by Adaptive Importance Sampling}},
  author    = {Li, Ximing and Li, Changchun and Chi, Jinjin and Ouyang, Jihong},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {2404--2410},
  doi       = {10.24963/IJCAI.2018/333},
  url       = {https://mlanthology.org/ijcai/2018/li2018ijcai-variance/}
}