Provably Scalable Black-Box Variational Inference with Structured Variational Families
Abstract
Variational families with full-rank covariance approximations are known not to work well in black-box variational inference (BBVI), both empirically and theoretically. In fact, recent computational complexity results for BBVI have established that full-rank variational families scale poorly with the dimensionality of the problem compared to, e.g., mean-field families. This is particularly critical for hierarchical Bayesian models with local variables, whose dimensionality increases with the size of the dataset. Consequently, one gets an iteration complexity with an explicit $\mathcal{O}(N^2)$ dependence on the dataset size $N$. In this paper, we explore a theoretical middle ground between mean-field variational families and full-rank families: structured variational families. We rigorously prove that certain scale matrix structures can achieve a better iteration complexity of $\mathcal{O}(N)$, implying better scaling with respect to $N$. We empirically verify our theoretical results on large-scale hierarchical models.
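The scaling gap the abstract describes can be illustrated by counting the free parameters in the scale (Cholesky) factor of a Gaussian variational family. The sketch below assumes a hierarchical model with `g` global latent dimensions and `n` local blocks of `l` dimensions each; the specific block structure (dense global block, dense per-local blocks, and local-global coupling) is an illustrative assumption, not necessarily the exact family analyzed in the paper.

```python
# Parameter counts for the scale matrix of a Gaussian variational family
# over a hierarchical model: g global dims, n local blocks of l dims each.
# (Illustrative structure; hedged assumption, not the paper's exact family.)

def total_dim(g: int, l: int, n: int) -> int:
    """Total latent dimensionality d = g + n * l."""
    return g + n * l

def mean_field_params(g: int, l: int, n: int) -> int:
    """Diagonal scale: one parameter per dimension -> O(n)."""
    return total_dim(g, l, n)

def full_rank_params(g: int, l: int, n: int) -> int:
    """Dense lower-triangular Cholesky factor -> O(n^2)."""
    d = total_dim(g, l, n)
    return d * (d + 1) // 2

def structured_params(g: int, l: int, n: int) -> int:
    """Block-structured factor: dense global block, one dense block per
    local variable, plus a local-to-global coupling block -> O(n)."""
    return g * (g + 1) // 2 + n * (l * (l + 1) // 2 + l * g)

if __name__ == "__main__":
    for n in (10, 100, 1000):
        print(n,
              mean_field_params(2, 3, n),
              full_rank_params(2, 3, n),
              structured_params(2, 3, n))
```

As `n` grows, the full-rank count grows quadratically while both the mean-field and structured counts grow linearly, mirroring the $\mathcal{O}(N^2)$ versus $\mathcal{O}(N)$ iteration complexities discussed above.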
Cite
Text
Ko et al. "Provably Scalable Black-Box Variational Inference with Structured Variational Families." International Conference on Machine Learning, 2024.
Markdown
[Ko et al. "Provably Scalable Black-Box Variational Inference with Structured Variational Families." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/ko2024icml-provably/)
BibTeX
@inproceedings{ko2024icml-provably,
title = {{Provably Scalable Black-Box Variational Inference with Structured Variational Families}},
author = {Ko, Joohwan and Kim, Kyurae and Kim, Woo Chang and Gardner, Jacob R.},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {24896--24931},
volume = {235},
url = {https://mlanthology.org/icml/2024/ko2024icml-provably/}
}