MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing
Abstract
Existing popular methods for semi-supervised learning with Graph Neural Networks (such as the Graph Convolutional Network) provably cannot learn a general class of neighborhood mixing relationships. To address this weakness, we propose a new model, MixHop, that can learn these relationships, including difference operators, by repeatedly mixing feature representations of neighbors at various distances. MixHop requires no additional memory or computational complexity, and outperforms challenging baselines. In addition, we propose sparsity regularization that allows us to visualize how the network prioritizes neighborhood information across different graph datasets. Our analysis of the learned architectures reveals that neighborhood mixing varies per dataset.
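The "mixing at various distances" described above can be illustrated concretely. The sketch below is a minimal, hypothetical NumPy implementation of one MixHop-style layer, assuming the formulation in the paper: for each power j in a chosen set, propagate node features j hops via the normalized adjacency matrix, apply a per-power weight matrix, and concatenate the results. The function names and toy graph are illustrative, not from the paper's code release.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetrically normalized adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    A = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

def mixhop_layer(A_hat, H, weights, powers=(0, 1, 2)):
    # One MixHop-style layer: for each power j, propagate features
    # j hops (A_hat^j @ H), apply that power's weight matrix,
    # then concatenate the results along the feature axis.
    outputs = []
    for j, W in zip(powers, weights):
        P_j = np.linalg.matrix_power(A_hat, j)  # j = 0 is the identity
        outputs.append(np.maximum(P_j @ H @ W, 0.0))  # ReLU activation
    return np.concatenate(outputs, axis=1)

# Toy example: 4-node cycle graph, 3 input features, 2 output dims per power.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
H = rng.standard_normal((4, 3))
weights = [rng.standard_normal((3, 2)) for _ in range(3)]
out = mixhop_layer(normalize_adjacency(A), H, weights)
print(out.shape)  # (4, 6): 3 powers x 2 output dims each
```

Because each power gets its own weight matrix, a subsequent layer can combine columns from different powers, which is what lets the model represent difference operators between neighborhoods at different distances.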
Cite
Text
Abu-El-Haija et al. "MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing." International Conference on Machine Learning, 2019.
Markdown
[Abu-El-Haija et al. "MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/abuelhaija2019icml-mixhop/)
BibTeX
@inproceedings{abuelhaija2019icml-mixhop,
title = {{MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing}},
author = {Abu-El-Haija, Sami and Perozzi, Bryan and Kapoor, Amol and Alipourfard, Nazanin and Lerman, Kristina and Harutyunyan, Hrayr and Ver Steeg, Greg and Galstyan, Aram},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {21--29},
volume = {97},
url = {https://mlanthology.org/icml/2019/abuelhaija2019icml-mixhop/}
}