Approximate Probabilistic Inference with Composed Flows
Abstract
We study the problem of probabilistic inference on the joint distribution defined by a normalizing flow model. Given a pre-trained flow model $p(\boldsymbol{x})$, we wish to estimate $p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)$ for some partitioning of the variables $\boldsymbol{x} = (\boldsymbol{x}_1, \boldsymbol{x}_2)$. We first show that this task is computationally hard for a large class of flow models. Motivated by this, we propose a framework for \textit{approximate} probabilistic inference. Specifically, our method trains a new flow model with the property that its composition with the given model approximates the target conditional distribution. We describe how to train this new model using variational inference and how to handle conditioning under arbitrary differentiable transformations. Experimentally, our approach outperforms Langevin Dynamics in terms of sample quality, while requiring far fewer parameters and much less training time than regular variational inference. We further validate the flexibility of our method on a variety of inference tasks with applications to inverse problems.
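To make the variational-inference step concrete, the snippet below is a rough, self-contained PyTorch sketch (not the authors' implementation): it trains a small flow $q_\phi(\boldsymbol{x}_2)$ by minimizing a Monte-Carlo estimate of $\mathrm{KL}(q_\phi(\boldsymbol{x}_2) \,\|\, p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1))$, which equals $\mathbb{E}_{q_\phi}[\log q_\phi(\boldsymbol{x}_2) - \log p(\boldsymbol{x}_1, \boldsymbol{x}_2)]$ up to a constant. All names here (`SmallFlow`, `AffineLayer`, `joint_log_prob`) are hypothetical, and a real implementation would use expressive coupling layers rather than the toy elementwise-affine bijection shown.

```python
import torch
import torch.nn as nn

class AffineLayer(nn.Module):
    """Toy elementwise affine bijection x = z * exp(s) + t (hypothetical).
    Invertible with a diagonal Jacobian, so log|det J| = sum(s)."""
    def __init__(self, dim):
        super().__init__()
        self.s = nn.Parameter(torch.zeros(dim))
        self.t = nn.Parameter(torch.zeros(dim))

    def forward(self, z):
        return z * torch.exp(self.s) + self.t, self.s.sum()

class SmallFlow(nn.Module):
    """Tiny flow q_phi over the unobserved block x2: a standard-normal
    base distribution pushed through a stack of affine layers."""
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(AffineLayer(dim) for _ in range(n_layers))

    def sample_with_log_prob(self, n):
        z = torch.randn(n, self.dim)
        log_q = torch.distributions.Normal(0.0, 1.0).log_prob(z).sum(dim=1)
        for layer in self.layers:
            z, log_det = layer(z)
            log_q = log_q - log_det  # change-of-variables correction
        return z, log_q

def vi_loss(q, joint_log_prob, x1, n_samples=128):
    """Monte-Carlo estimate of KL(q(x2) || p(x2 | x1)) up to an additive
    constant: E_{x2 ~ q}[ log q(x2) - log p(x1, x2) ]. `joint_log_prob`
    is the exact log-density of the pretrained flow."""
    x2, log_q = q.sample_with_log_prob(n_samples)
    x = torch.cat([x1.expand(n_samples, -1), x2], dim=1)
    return (log_q - joint_log_prob(x)).mean()

# Hypothetical usage, with `pretrained` the given flow over x = (x1, x2):
# q = SmallFlow(dim_x2)
# opt = torch.optim.Adam(q.parameters(), lr=1e-3)
# for step in range(5000):
#     opt.zero_grad()
#     loss = vi_loss(q, pretrained.log_prob, x1_observed)
#     loss.backward()
#     opt.step()
```

Because samples are reparameterized (a base Gaussian pushed through the learned layers), gradients flow through the pretrained model's log-density back to $\phi$; this is a standard stochastic VI setup, and the composition with the pretrained flow described in the abstract is realized here only implicitly through scoring under its joint density.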
Cite
Text
Whang et al. "Approximate Probabilistic Inference with Composed Flows." NeurIPS 2020 Workshops: Deep_Inverse, 2020.Markdown
[Whang et al. "Approximate Probabilistic Inference with Composed Flows." NeurIPS 2020 Workshops: Deep_Inverse, 2020.](https://mlanthology.org/neuripsw/2020/whang2020neuripsw-approximate/)BibTeX
@inproceedings{whang2020neuripsw-approximate,
title = {{Approximate Probabilistic Inference with Composed Flows}},
author = {Whang, Jay and Lindgren, Erik and Dimakis, Alex},
booktitle = {NeurIPS 2020 Workshops: Deep_Inverse},
year = {2020},
url = {https://mlanthology.org/neuripsw/2020/whang2020neuripsw-approximate/}
}