An Efficient Minibatch Acceptance Test for Metropolis-Hastings
Abstract
We present a novel Metropolis-Hastings method for large datasets that uses small expected-size minibatches of data. Previous work on reducing the cost of Metropolis-Hastings tests yields only constant factor reductions versus using the full dataset for each sample. Here we present a method that can be tuned to provide arbitrarily small batch sizes, by adjusting either proposal step size or temperature. Our test uses the noise-tolerant Barker acceptance test with a novel additive correction variable. The resulting test has similar cost to a normal SGD update. Our experiments demonstrate several order-of-magnitude speedups over previous work.
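As background for the acceptance rule the abstract describes, here is a minimal illustrative sketch, not the paper's implementation. The exact Barker test accepts a move with probability 1/(1 + exp(-Δ)), which is equivalent to accepting whenever Δ plus a standard logistic draw is positive; the paper replaces Δ with a noisy minibatch estimate and adds a correction variable so that the combined noise is again (approximately) logistic. The helpers `sample_correction` and `naive_correction` below are hypothetical stand-ins for the paper's correction-variable construction.

```python
import numpy as np

def barker_accept(delta_hat, sigma, sample_correction):
    """Minibatch Barker-style acceptance decision (illustrative sketch).

    delta_hat : noisy minibatch estimate of the log acceptance ratio Delta,
                assumed to be approximately Normal(Delta, sigma^2).
    sigma     : estimated standard deviation of delta_hat.
    sample_correction : callable returning one draw of the additive
                correction variable, chosen so that Normal(0, sigma^2)
                plus the correction is (approximately) standard logistic.
                Its construction follows the paper and is assumed here.
    """
    # Exact Barker test: accept with probability 1 / (1 + exp(-Delta)),
    # i.e. accept iff Delta + V > 0 with V ~ Logistic(0, 1). With a noisy
    # Delta estimate, the correction variable supplies the remaining
    # noise needed to recover a logistic total.
    return delta_hat + sample_correction(sigma) > 0


def naive_correction(sigma, rng=np.random.default_rng()):
    """Crude stand-in for the correction sampler: for small sigma the
    correction is close to a plain standard logistic draw."""
    return rng.logistic(loc=0.0, scale=1.0)


# Hypothetical usage: decide whether to accept a proposed state given a
# minibatch estimate of the log ratio and its noise level.
accepted = barker_accept(delta_hat=0.3, sigma=0.2,
                         sample_correction=naive_correction)
```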
Cite
Text
Seita et al. "An Efficient Minibatch Acceptance Test for Metropolis-Hastings." Conference on Uncertainty in Artificial Intelligence, 2017. doi:10.24963/ijcai.2018/753
Markdown
[Seita et al. "An Efficient Minibatch Acceptance Test for Metropolis-Hastings." Conference on Uncertainty in Artificial Intelligence, 2017.](https://mlanthology.org/uai/2017/seita2017uai-efficient/) doi:10.24963/ijcai.2018/753
BibTeX
@inproceedings{seita2017uai-efficient,
title = {{An Efficient Minibatch Acceptance Test for Metropolis-Hastings}},
author = {Seita, Daniel and Pan, Xinlei and Chen, Haoyu and Canny, John F.},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2017},
doi = {10.24963/ijcai.2018/753},
url = {https://mlanthology.org/uai/2017/seita2017uai-efficient/}
}