Path-Gradient Estimators for Continuous Normalizing Flows

Abstract

Recent work has established a path-gradient estimator for simple variational Gaussian distributions and has argued that the path gradient is particularly beneficial in the regime in which the variational distribution approaches the exact target distribution. In many applications, however, this regime cannot be reached by a simple Gaussian variational distribution. In this work, we overcome this crucial limitation by proposing a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows. We outline an efficient algorithm to calculate this estimator and establish its superior performance empirically.
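To make the abstract's central object concrete: for the simple Gaussian variational family that the prior work covers, the path-gradient estimator keeps only the derivative that flows through the reparameterized sample and drops the score term, whose expectation vanishes. The sketch below is a minimal NumPy illustration of that idea for the reverse KL against a standard-normal target; the Gaussian family, the target p = N(0, 1), and the sample size are illustrative assumptions, not the paper's continuous-normalizing-flow algorithm.

```python
import numpy as np

def path_gradient_kl(mu, sigma, n_samples, rng):
    """Path-gradient estimate of d/d(mu), d/d(sigma) of KL(q || p),
    with q = N(mu, sigma^2) and target p = N(0, 1).

    Samples are reparameterized as x = mu + sigma * eps. The path
    gradient is (grad_x log q(x) - grad_x log p(x)) * dx/dparam;
    the direct score term grad_param log q(x), evaluated at fixed x,
    has zero expectation and is dropped.
    """
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps
    grad_x_log_q = -(x - mu) / sigma**2   # score of q in x
    grad_x_log_p = -x                     # score of N(0, 1) in x
    diff = grad_x_log_q - grad_x_log_p
    g_mu = np.mean(diff)                  # dx/dmu = 1
    g_sigma = np.mean(diff * eps)         # dx/dsigma = eps
    return g_mu, g_sigma

rng = np.random.default_rng(0)
g_mu, g_sigma = path_gradient_kl(0.5, 1.3, 200_000, rng)
# Analytic gradients of KL(N(mu, sigma^2) || N(0, 1)) for comparison:
#   d/dmu = mu,  d/dsigma = sigma - 1/sigma
```

For this Gaussian example the estimates match the closed-form gradients up to Monte Carlo noise; the paper's contribution is extending this construction to flows where no such closed form exists.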

Cite

Text

Vaitl et al. "Path-Gradient Estimators for Continuous Normalizing Flows." International Conference on Machine Learning, 2022.

Markdown

[Vaitl et al. "Path-Gradient Estimators for Continuous Normalizing Flows." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/vaitl2022icml-pathgradient/)

BibTeX

@inproceedings{vaitl2022icml-pathgradient,
  title     = {{Path-Gradient Estimators for Continuous Normalizing Flows}},
  author    = {Vaitl, Lorenz and Nicoli, Kim Andrea and Nakajima, Shinichi and Kessel, Pan},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {21945--21959},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/vaitl2022icml-pathgradient/}
}