Graph Normalizing Flows
Abstract
We introduce graph normalizing flows: a new, reversible graph neural network model for prediction and generation. On supervised tasks, graph normalizing flows perform similarly to message passing neural networks, but at a significantly reduced memory footprint, allowing them to scale to larger graphs. In the unsupervised case, we combine graph normalizing flows with a novel graph auto-encoder to create a generative model of graph structures. Our model is permutation-invariant, generating entire graphs with a single feed-forward pass, and achieves competitive results with state-of-the-art auto-regressive models, while being better suited to parallel computing architectures.
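The reversibility described above can be illustrated with an affine coupling step over node features, where half of the features are updated conditioned on messages computed from the other half. This is a minimal sketch only: the function names (`message_pass`, `coupling_forward`), the mean-neighbor aggregation, and the tiny one-layer networks are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def mlp(x, W, b):
    # Tiny one-layer network standing in for a learned coupling function.
    return np.tanh(x @ W + b)

def message_pass(H, A, W, b):
    # Mean-aggregate neighbor features via the adjacency matrix, then transform.
    # (Illustrative aggregation; the paper's message function may differ.)
    deg = A.sum(axis=1, keepdims=True) + 1e-8
    return mlp((A @ H) / deg, W, b)

def coupling_forward(H1, H2, A, params):
    # H2 is updated conditioned on messages from H1; H1 passes through unchanged,
    # which is what makes the step exactly invertible.
    s = message_pass(H1, A, *params["scale"])
    t = message_pass(H1, A, *params["shift"])
    return H1, H2 * np.exp(s) + t

def coupling_inverse(H1, Z2, A, params):
    # Exact inverse: recompute s and t from the untouched half, undo the affine map.
    s = message_pass(H1, A, *params["scale"])
    t = message_pass(H1, A, *params["shift"])
    return H1, (Z2 - t) * np.exp(-s)

rng = np.random.default_rng(0)
n, d = 5, 4                                  # 5 nodes, feature halves of width 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T               # symmetric adjacency, no self-loops
H1, H2 = rng.normal(size=(n, d)), rng.normal(size=(n, d))
params = {"scale": (0.1 * rng.normal(size=(d, d)), np.zeros(d)),
          "shift": (0.1 * rng.normal(size=(d, d)), np.zeros(d))}

Z1, Z2 = coupling_forward(H1, H2, A, params)
R1, R2 = coupling_inverse(Z1, Z2, A, params)
print(np.allclose(R2, H2))  # → True: inputs recovered without storing activations
```

Because the inverse is exact, intermediate activations need not be cached for backpropagation, which is the source of the reduced memory footprint mentioned in the abstract.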
Cite
Text
Liu et al. "Graph Normalizing Flows." Neural Information Processing Systems, 2019.
Markdown
[Liu et al. "Graph Normalizing Flows." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/liu2019neurips-graph/)
BibTeX
@inproceedings{liu2019neurips-graph,
title = {{Graph Normalizing Flows}},
author = {Liu, Jenny and Kumar, Aviral and Ba, Jimmy and Kiros, Jamie and Swersky, Kevin},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {13578--13588},
url = {https://mlanthology.org/neurips/2019/liu2019neurips-graph/}
}