Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions
Abstract
We propose a projected Stein variational Newton (pSVN) method for high-dimensional Bayesian inference. To address the curse of dimensionality, we exploit the intrinsic low-dimensional geometric structure of the posterior distribution in the high-dimensional parameter space via the Hessian operator of the log posterior, and we perform a parallel update of the parameter samples projected into a low-dimensional subspace by an SVN method. The subspace is adaptively constructed from the eigenvectors of the Hessian averaged over the current samples. We demonstrate fast convergence of the proposed method, complexity independent of the parameter and sample dimensions, and parallel scalability.
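The subspace construction the abstract describes (average the per-sample Hessians, then keep the dominant eigenvectors) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names are our own, the Hessians are assumed to be available as dense symmetric matrices, and the actual pSVN method uses matrix-free generalized eigensolvers and a Newton-type sample update that are omitted here.

```python
import numpy as np

def build_projection_basis(hessians, r):
    """Adaptively build a low-dimensional basis from averaged Hessians.

    hessians: array of shape (N, d, d), one (symmetric) Hessian of the
              negative log posterior per current sample.
    r:        target subspace dimension.
    Returns a (d, r) orthonormal basis of dominant eigenvectors.
    """
    # Average the Hessians over the current samples.
    H_bar = np.mean(hessians, axis=0)
    # eigh returns eigenvalues in ascending order; take the r largest.
    eigvals, eigvecs = np.linalg.eigh(H_bar)
    idx = np.argsort(eigvals)[::-1][:r]
    return eigvecs[:, idx]

def project_samples(samples, basis):
    """Coefficients of each sample in the subspace spanned by `basis`.

    samples: array of shape (N, d). Returns an (N, r) coefficient array;
    the low-dimensional SVN update would then act on these coefficients.
    """
    return samples @ basis
```

In the full method, this basis is recomputed adaptively as the samples move, so the subspace tracks the data-informed directions of the posterior.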
Cite
Text

Chen et al. "Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions." Neural Information Processing Systems, 2019.

Markdown

[Chen et al. "Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/chen2019neurips-projected/)

BibTeX
@inproceedings{chen2019neurips-projected,
title = {{Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions}},
author = {Chen, Peng and Wu, Keyi and Chen, Joshua and O'Leary-Roseberry, Tom and Ghattas, Omar},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {15130--15139},
url = {https://mlanthology.org/neurips/2019/chen2019neurips-projected/}
}