Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models
Abstract
We exploit minimally faithful inversion of graphical model structures to specify sparse continuous normalizing flows (CNFs) for amortized inference. We find that the sparsity of this factorization can be exploited to reduce the number of parameters in the neural network, the number of adaptive integration steps of the flow, and consequently the FLOPs at both training and inference time, without decreasing performance compared to unconstrained flows. By expressing the structure inversion as a compilation pass in a probabilistic programming language, we are able to apply it in a novel way to models as complex as convolutional neural networks. Furthermore, we extend the training objective for CNFs in the context of inference amortization to the symmetric Kullback-Leibler divergence, and demonstrate its theoretical and practical advantages.
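The following is a minimal sketch (not the authors' implementation) of the core idea of imposing the inverted graphical model structure on the CNF dynamics: it assumes the minimally faithful inversion has already been computed and is given as a binary matrix `mask` of shape (n_latent, n_latent + n_obs), where mask[i, j] = 1 allows dz_i/dt to depend on input j. All names (masked_dynamics, W, b) are hypothetical.

import numpy as np

def masked_dynamics(t, z, obs, W, b, mask):
    """Single masked linear layer for the CNF vector field dz/dt.

    Element-wise masking of W makes the Jacobian of dz/dt with respect
    to (z, obs) sparse, matching the inverted graph structure.  A deeper
    network would need MADE-style per-layer masks that assign hidden
    units to outputs so the end-to-end sparsity pattern is preserved.
    """
    inp = np.concatenate([z, obs])
    return (W * mask) @ inp + b

# Usage: integrate dz/dt with any ODE solver to evaluate the flow.
n_latent, n_obs = 3, 2
mask = np.array([[1, 0, 0, 1, 0],
                 [1, 1, 0, 0, 1],
                 [0, 1, 1, 1, 1]])      # example inverted-graph structure
W = np.random.randn(n_latent, n_latent + n_obs)
b = np.zeros(n_latent)
z0, x = np.random.randn(n_latent), np.random.randn(n_obs)
dz_dt = masked_dynamics(0.0, z0, x, W, b, mask)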
Cite
Text
Weilbach et al. "Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models." Artificial Intelligence and Statistics, 2020.
Markdown
[Weilbach et al. "Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/weilbach2020aistats-structured/)
BibTeX
@inproceedings{weilbach2020aistats-structured,
title = {{Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models}},
author = {Weilbach, Christian and Beronov, Boyan and Wood, Frank and Harvey, William},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {4441--4451},
volume = {108},
url = {https://mlanthology.org/aistats/2020/weilbach2020aistats-structured/}
}