Break the Ceiling: Stronger Multi-Scale Deep Graph Convolutional Networks
Abstract
Recently, neural-network-based approaches have achieved significant progress in solving large, complex, graph-structured problems. Nevertheless, the advantages of multi-scale information and deep architectures have not been sufficiently exploited. In this paper, we first analyze key factors constraining the expressive power of existing Graph Convolutional Networks (GCNs), including the activation function and shallow learning mechanisms. Then, we generalize spectral graph convolution and deep GCNs in block Krylov subspace forms, upon which we devise two architectures, both scalable in depth but making use of multi-scale information in different ways. On several node classification tasks, the proposed architectures achieve state-of-the-art performance.
Cite
Text
Luan et al. "Break the Ceiling: Stronger Multi-Scale Deep Graph Convolutional Networks." Neural Information Processing Systems, 2019.

Markdown
[Luan et al. "Break the Ceiling: Stronger Multi-Scale Deep Graph Convolutional Networks." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/luan2019neurips-break/)

BibTeX
@inproceedings{luan2019neurips-break,
title = {{Break the Ceiling: Stronger Multi-Scale Deep Graph Convolutional Networks}},
author = {Luan, Sitao and Zhao, Mingde and Chang, Xiao-Wen and Precup, Doina},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {10945--10955},
url = {https://mlanthology.org/neurips/2019/luan2019neurips-break/}
}