Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation
Abstract
While Graph Neural Networks (GNNs) have shown convincing performance in handling non-Euclidean graph data, the high inference latency caused by their message-passing mechanism hinders deployment in real-time scenarios. One emerging inference-acceleration approach is to distill knowledge from teacher GNNs into message-passing-free student multi-layer perceptrons (MLPs). Nevertheless, because graph heterophily degrades the performance of teacher GNNs, and because student MLPs generalize poorly on graph data, GNN-to-MLP designs often achieve inferior performance. To tackle this challenge, we propose boosting adaptive GRaph Augmented MLPs via Customized knowlEdge Distillation (GRACED), a novel approach to learning graph knowledge effectively and efficiently. Specifically, we first design a novel customized knowledge distillation strategy that modifies the guided knowledge to mitigate the adverse influence of heterophily on student MLPs. Then, we introduce an adaptive graph propagation approach that precomputes an aggregated feature for each node, accounting for both homophily and heterophily, to help the student MLPs learn graph information. Furthermore, we design an aggregation-feature approximation technique for inductive scenarios. Extensive experiments on node classification and theoretical analyses demonstrate the superiority of GRACED over state-of-the-art methods under both transductive and inductive settings, across homophilic and heterophilic datasets.
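The abstract builds on the standard GNN-to-MLP distillation setup: a student MLP is trained on a mix of hard labels and the teacher GNN's temperature-softened predictions. The following is a minimal generic sketch of that baseline objective, not the paper's customized variant; the `alpha` and `T` hyperparameters and function names here are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Generic knowledge-distillation objective (illustrative only):
    a convex combination of hard-label cross-entropy and the KL
    divergence from the teacher's softened output distribution.
    `alpha` weights the two terms; `T` is the softening temperature."""
    # Hard-label cross-entropy on the student's (T=1) predictions.
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(labels)), labels] + 1e-12).mean()
    # KL(teacher || student) at temperature T, scaled by T^2 as is conventional.
    p_t = softmax(teacher_logits, T)
    p_s_T = softmax(student_logits, T)
    kl = (p_t * np.log((p_t + 1e-12) / (p_s_T + 1e-12))).sum(axis=-1).mean() * T * T
    return (1 - alpha) * ce + alpha * kl
```

In a GNN-to-MLP pipeline, `teacher_logits` would come from a pretrained message-passing GNN and `student_logits` from an MLP that sees only node features, so inference needs no graph at all.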
Cite
Text

Wei et al. "Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2023. doi:10.1007/978-3-031-43418-1_6

Markdown

[Wei et al. "Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2023.](https://mlanthology.org/ecmlpkdd/2023/wei2023ecmlpkdd-boosting/) doi:10.1007/978-3-031-43418-1_6

BibTeX
@inproceedings{wei2023ecmlpkdd-boosting,
title = {{Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation}},
author = {Wei, Shaowei and Wu, Zhengwei and Zhang, Zhiqiang and Zhou, Jun},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2023},
pages = {87--103},
doi = {10.1007/978-3-031-43418-1_6},
url = {https://mlanthology.org/ecmlpkdd/2023/wei2023ecmlpkdd-boosting/}
}