Scalable Spike-and-Slab
Abstract
Spike-and-slab priors are commonly used for Bayesian variable selection due to their interpretability and favorable statistical properties. However, existing samplers for spike-and-slab posteriors incur prohibitive computational costs when the number of variables is large. In this article, we propose Scalable Spike-and-Slab (S^3), a scalable Gibbs sampling implementation for high-dimensional Bayesian regression with the continuous spike-and-slab prior of George & McCulloch (1993). For a dataset with n observations and p covariates, S^3 has order max{n^2 p_t, np} computational cost at iteration t, where p_t never exceeds the number of covariates switching spike-and-slab states between iterations t and t-1 of the Markov chain. This improves upon the order n^2 p per-iteration cost of state-of-the-art implementations as, typically, p_t is substantially smaller than p. We apply S^3 to synthetic and real-world datasets, demonstrating orders of magnitude speed-ups over existing exact samplers and significant gains in inferential quality over approximate samplers with comparable cost.
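For context on the Gibbs updates that S^3 accelerates, the following is a minimal sketch of one sweep of a baseline (non-scalable) Gibbs sampler for linear regression under a continuous spike-and-slab prior in the style of George & McCulloch (1993). The hyperparameter names (tau0, tau1, q, a0, b0), the inverse-gamma prior on the noise variance, and the unscaled prior variances are illustrative assumptions rather than the paper's exact setup; the O(p^3) Cholesky step marked below is the naive bottleneck that state-of-the-art samplers reduce to order n^2 p per iteration and that S^3 reduces further to order max{n^2 p_t, np}.

```python
# Minimal sketch (NOT the S^3 algorithm): one sweep of a baseline Gibbs sampler
# for Bayesian linear regression with a continuous spike-and-slab prior,
#   beta_j | z_j ~ (1 - z_j) N(0, tau0^2) + z_j N(0, tau1^2),  z_j ~ Bernoulli(q),
# in the style of George & McCulloch (1993). Hyperparameter names and the
# inverse-gamma noise prior are illustrative assumptions, not the paper's notation.
import numpy as np
from scipy import stats
from scipy.special import expit


def gibbs_sweep(y, X, beta, z, sigma2,
                tau0=0.01, tau1=1.0, q=0.1, a0=1.0, b0=1.0, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    n, p = X.shape

    # 1) beta | z, sigma2, y ~ N(m, V) with V^{-1} = X'X / sigma2 + D^{-1},
    #    D = diag of prior variances (tau0^2 for spike, tau1^2 for slab).
    d = np.where(z == 1, tau1 ** 2, tau0 ** 2)
    prec = X.T @ X / sigma2 + np.diag(1.0 / d)
    L = np.linalg.cholesky(prec)                  # O(p^3): the naive bottleneck
    m = np.linalg.solve(prec, X.T @ y / sigma2)
    beta = m + np.linalg.solve(L.T, rng.standard_normal(p))

    # 2) z_j | beta_j ~ Bernoulli with odds
    #    q N(beta_j; 0, tau1^2) / ((1 - q) N(beta_j; 0, tau0^2)).
    log_odds = (np.log(q / (1 - q))
                + stats.norm.logpdf(beta, scale=tau1)
                - stats.norm.logpdf(beta, scale=tau0))
    z = (rng.random(p) < expit(log_odds)).astype(int)

    # 3) sigma2 | beta, y ~ Inverse-Gamma(a0 + n/2, b0 + ||y - X beta||^2 / 2).
    resid = y - X @ beta
    sigma2 = stats.invgamma.rvs(a0 + n / 2, scale=b0 + resid @ resid / 2,
                                random_state=rng)
    return beta, z, sigma2
```

Iterating this sweep targets the joint posterior over (beta, z, sigma2); the per-iteration savings claimed in the abstract come from exploiting that few coordinates of z typically change between consecutive iterations.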
Cite
Text
Biswas et al. "Scalable Spike-and-Slab." International Conference on Machine Learning, 2022.
Markdown
[Biswas et al. "Scalable Spike-and-Slab." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/biswas2022icml-scalable/)
BibTeX
@inproceedings{biswas2022icml-scalable,
title = {{Scalable Spike-and-Slab}},
author = {Biswas, Niloy and Mackey, Lester and Meng, Xiao-Li},
booktitle = {International Conference on Machine Learning},
year = {2022},
pages = {2021-2040},
volume = {162},
url = {https://mlanthology.org/icml/2022/biswas2022icml-scalable/}
}