Neural Message Passing for Multi-Relational Ordered and Recursive Hypergraphs
Abstract
Message passing neural networks (MPNNs) have recently emerged as a successful framework, achieving state-of-the-art performance on many graph-based learning tasks. MPNNs have also been extended to multi-relational graphs (where each edge is labelled) and to hypergraphs (where each edge can connect any number of vertices). However, in real-world datasets involving text and knowledge, relationships are far more complex: hyperedges can be multi-relational, recursive, and ordered. Such structures present several unique challenges, because it is not clear how to adapt MPNNs to the variable-sized hyperedges they contain. In this work, we first unify existing MPNNs on different structures into a single framework, G-MPNN (Generalised MPNN). Motivated by real-world datasets, we then propose MPNN-R (MPNN-Recursive), a novel extension of the framework that handles recursively-structured data. Experimental results demonstrate the effectiveness of the proposed G-MPNN and MPNN-R.
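The abstract gives no equations, but the core idea of message passing on a hypergraph can be sketched as a two-stage update: vertex features are aggregated into each hyperedge, transformed, and scattered back to the incident vertices. Below is a minimal, hypothetical PyTorch sketch of one such step; the function name, weight matrices, and mean aggregation are illustrative assumptions, not the paper's actual G-MPNN or MPNN-R formulation.

import torch

# Hypothetical sketch of one generic message-passing step on a hypergraph,
# loosely in the spirit of hypergraph MPNNs; NOT the paper's exact update.
# H is a |V| x |E| binary incidence matrix, X holds d-dim vertex features.
def hypergraph_message_pass(H, X, W_edge, W_vertex):
    # Stage 1: aggregate vertex features into each hyperedge (mean over incident vertices).
    deg_e = H.sum(dim=0, keepdim=True).clamp(min=1)      # 1 x |E|, hyperedge sizes
    edge_feat = (H.t() @ X) / deg_e.t()                  # |E| x d
    edge_msg = torch.relu(edge_feat @ W_edge)            # edge-level transform

    # Stage 2: scatter hyperedge messages back to incident vertices (mean over edges).
    deg_v = H.sum(dim=1, keepdim=True).clamp(min=1)      # |V| x 1, vertex degrees
    vertex_in = (H @ edge_msg) / deg_v                   # |V| x d
    return torch.relu(X @ W_vertex + vertex_in)          # updated vertex states

# Tiny usage example: 4 vertices, 2 hyperedges, 8-dim features.
H = torch.tensor([[1., 0.], [1., 1.], [0., 1.], [1., 1.]])
X = torch.randn(4, 8)
W_edge, W_vertex = torch.randn(8, 8), torch.randn(8, 8)
print(hypergraph_message_pass(H, X, W_edge, W_vertex).shape)  # torch.Size([4, 8])

Handling multi-relational, ordered, or recursive hyperedges (where hyperedges can themselves act as vertices of other hyperedges) would require relation-specific and position-aware transforms on top of this skeleton, which is the gap the paper's G-MPNN and MPNN-R address.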
Cite
Text
Yadati. "Neural Message Passing for Multi-Relational Ordered and Recursive Hypergraphs." Neural Information Processing Systems, 2020.Markdown
[Yadati. "Neural Message Passing for Multi-Relational Ordered and Recursive Hypergraphs." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/yadati2020neurips-neural/)BibTeX
@inproceedings{yadati2020neurips-neural,
  title = {{Neural Message Passing for Multi-Relational Ordered and Recursive Hypergraphs}},
  author = {Yadati, Naganand},
  booktitle = {Neural Information Processing Systems},
  year = {2020},
  url = {https://mlanthology.org/neurips/2020/yadati2020neurips-neural/}
}