Beyond Graph Convolutional Network: An Interpretable Regularizer-Centered Optimization Framework
Abstract
Graph convolutional networks (GCNs) have been attracting widespread attention due to their encouraging performance and powerful generalization ability. However, few works provide a general view to interpret various GCNs and to guide their design. In this paper, by revisiting the original GCN, we induce an interpretable regularizer-centered optimization framework in which, by building appropriate regularizers, we can interpret most GCNs, such as APPNP, JKNet, DAGNN, and GNN-LF/HF. Further, under the proposed framework, we devise a dual-regularizer graph convolutional network (dubbed tsGCN) to capture topological and semantic structures from graph data. Since the derived learning rule for tsGCN contains the inverse of a large matrix and is thus time-consuming, we leverage the Woodbury matrix identity and low-rank approximation tricks to reduce the high computational complexity of computing infinite-order graph convolutions. Extensive experiments on eight public datasets demonstrate that tsGCN achieves superior performance against quite a few state-of-the-art competitors on classification tasks.
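To see why the Woodbury trick helps, here is a minimal sketch of the identity the abstract invokes; the exact form of tsGCN's learning rule is not given here, so the matrix shapes below are assumptions for illustration. Suppose the update requires $(A + UCV)^{-1}$, where $A \in \mathbb{R}^{n \times n}$ is cheap to invert (e.g., diagonal) and $U \in \mathbb{R}^{n \times k}$, $C \in \mathbb{R}^{k \times k}$, $V \in \mathbb{R}^{k \times n}$ come from a rank-$k$ approximation with $k \ll n$. The Woodbury matrix identity gives

$$
(A + UCV)^{-1} = A^{-1} - A^{-1} U \left( C^{-1} + V A^{-1} U \right)^{-1} V A^{-1},
$$

so only a $k \times k$ matrix must be inverted, reducing the dominant cost from $O(n^3)$ for the full inverse to roughly $O(n k^2)$.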
Cite
Text
Wang et al. "Beyond Graph Convolutional Network: An Interpretable Regularizer-Centered Optimization Framework." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I4.25593

Markdown
[Wang et al. "Beyond Graph Convolutional Network: An Interpretable Regularizer-Centered Optimization Framework." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/wang2023aaai-beyond/) doi:10.1609/AAAI.V37I4.25593

BibTeX
@inproceedings{wang2023aaai-beyond,
title = {{Beyond Graph Convolutional Network: An Interpretable Regularizer-Centered Optimization Framework}},
author = {Wang, Shiping and Wu, Zhihao and Chen, Yuhong and Chen, Yong},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
  pages = {4693--4701},
doi = {10.1609/AAAI.V37I4.25593},
url = {https://mlanthology.org/aaai/2023/wang2023aaai-beyond/}
}