An Efficient Minibatch Acceptance Test for Metropolis-Hastings

Abstract

We present a novel Metropolis-Hastings method for large datasets that uses minibatches of small expected size. Previous work on reducing the cost of Metropolis-Hastings tests yields only constant-factor reductions over using the full dataset for each sample. Here we present a method that can be tuned to provide arbitrarily small batch sizes by adjusting either the proposal step size or the temperature. Our test uses the noise-tolerant Barker acceptance test with a novel additive correction variable. The resulting test has a cost similar to that of a standard SGD update. Our experiments demonstrate speedups of several orders of magnitude over previous work.
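The abstract only sketches the mechanism, so the following is a minimal illustrative sketch of a Barker-style minibatch acceptance test. It assumes per-example log-likelihood ratios, data stored in a NumPy array, and a hypothetical sample_correction(sigma) helper that draws the additive correction variable (its construction is the paper's contribution and is not reproduced here); prior and proposal terms are omitted for brevity, and this is not the authors' exact algorithm.

```python
import numpy as np

def barker_accept_full(delta):
    """Exact Barker test: accept with probability sigmoid(delta).

    Equivalent to drawing standard logistic noise V and accepting
    when delta + V > 0.
    """
    v = np.random.logistic(loc=0.0, scale=1.0)
    return delta + v > 0

def minibatch_barker_accept(log_ratio_fn, data, batch_size, sample_correction):
    """Sketch of a minibatch Barker-style test (illustrative only).

    The full-data log ratio delta is estimated from a random minibatch,
    so the estimate carries roughly Gaussian noise. The hypothetical
    sample_correction(sigma) is assumed to draw an additive correction
    variable chosen so that the combined noise on the estimate is
    approximately standard logistic, which turns the sign test below
    into a Barker acceptance decision.
    """
    n = len(data)
    idx = np.random.choice(n, size=batch_size, replace=False)
    per_point = np.array([log_ratio_fn(x) for x in data[idx]])
    delta_hat = n * per_point.mean()                              # unbiased estimate of delta
    sigma_hat = n * per_point.std(ddof=1) / np.sqrt(batch_size)   # its estimated standard error
    x_corr = sample_correction(sigma_hat)                         # additive correction variable
    return delta_hat + x_corr > 0
```

In this sketch the per-step cost scales with batch_size rather than with the full dataset size, which is the source of the SGD-like per-update cost claimed in the abstract.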

Cite

Text

Seita et al. "An Efficient Minibatch Acceptance Test for Metropolis-Hastings." International Joint Conference on Artificial Intelligence, 2018. doi:10.24963/IJCAI.2018/753

Markdown

[Seita et al. "An Efficient Minibatch Acceptance Test for Metropolis-Hastings." International Joint Conference on Artificial Intelligence, 2018.](https://mlanthology.org/ijcai/2018/seita2018ijcai-efficient/) doi:10.24963/IJCAI.2018/753

BibTeX

@inproceedings{seita2018ijcai-efficient,
  title     = {{An Efficient Minibatch Acceptance Test for Metropolis-Hastings}},
  author    = {Seita, Daniel and Pan, Xinlei and Chen, Haoyu and Canny, John F.},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2018},
  pages     = {5359--5363},
  doi       = {10.24963/IJCAI.2018/753},
  url       = {https://mlanthology.org/ijcai/2018/seita2018ijcai-efficient/}
}