Provable Smoothness Guarantees for Black-Box Variational Inference
Abstract
Black-box variational inference approximates a complex target distribution by gradient-based optimization of the parameters of a simpler distribution. Provable convergence guarantees require structural properties of the objective. This paper shows that for location-scale family approximations, if the target is M-Lipschitz smooth, then so is the “energy” part of the variational objective. The key proof idea is to describe gradients in a certain inner-product space, thus permitting the use of Bessel’s inequality. This result yields bounds on the location of the optimal parameters and is a key ingredient for convergence guarantees.
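To make the setting concrete, here is a minimal sketch (not the paper's code) of the quantity the abstract refers to: a location-scale approximation z = m + Cε with ε ~ N(0, I), and a reparameterization Monte Carlo estimate of the gradient of the "energy" term E_q[−log p(z)]. The target `grad_log_p` below is a hypothetical Gaussian example chosen only so that the score is Lipschitz; all names are assumptions for illustration.

```python
import numpy as np

# Hypothetical M-Lipschitz-smooth target: log p(z) = -0.5 * z^T A z,
# whose score gradient grad log p(z) = -A z is Lipschitz with constant ||A||.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

def grad_log_p(z):
    return -A @ z

def energy_grads(m, C, n_samples=1000, seed=0):
    """Reparameterization estimates of the gradients of the energy
    E_q[-log p(z)] with respect to the location m and scale C,
    where z = m + C @ eps and eps ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal((n_samples, len(m)))
    z = m + eps @ C.T
    # Gradients of -log p at each sample.
    g = np.array([-grad_log_p(zi) for zi in z])
    grad_m = g.mean(axis=0)                                   # d/dm
    grad_C = (g[:, :, None] * eps[:, None, :]).mean(axis=0)   # d/dC (outer products)
    return grad_m, grad_C
```

For this quadratic target the exact gradients are A·m and A·C, so the Monte Carlo estimates can be checked directly; the paper's result concerns the smoothness (Lipschitz gradients) of this energy term as a function of (m, C).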
Cite
Domke. "Provable Smoothness Guarantees for Black-Box Variational Inference." International Conference on Machine Learning, 2020.
BibTeX
@inproceedings{domke2020icml-provable,
title = {{Provable Smoothness Guarantees for Black-Box Variational Inference}},
author = {Domke, Justin},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {2587--2596},
volume = {119},
url = {https://mlanthology.org/icml/2020/domke2020icml-provable/}
}