SEA: Graph Shell Attention in Graph Neural Networks
Abstract
A common issue in Graph Neural Networks (GNNs) is known as over-smoothing. As the number of message-passing iterations increases, the nodes' representations of the input graph align with each other and become indiscernible. Recently, it has been shown that increasing a model's complexity by integrating an attention mechanism yields more expressive architectures. This is largely attributed to steering the nodes' representations only towards nodes that are more informative than others. Transformer models in combination with GNNs result in architectures including Graph Transformer Layers (GTL), where layers are entirely based on the attention operation. However, the calculation of a node's representation is still restricted to the computational workflow of a GNN. In our work, we relax the GNN architecture by means of implementing a routing heuristic. Specifically, the nodes' representations are routed to dedicated experts. Each expert calculates the representations according to their respective GNN workflow. The definitions of distinguishable GNNs result from k-localized views starting from the central node. We call this procedure Graph Shell Attention (SEA), where experts process different subgraphs in a transformer-motivated fashion. Intuitively, by increasing the number of experts, the models gain in expressiveness such that a node's representation is solely based on nodes that are located within the receptive field of an expert. We evaluate our architecture on various benchmark datasets showing competitive results compared to state-of-the-art models.
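The k-localized views the abstract mentions can be illustrated with a small sketch (not the authors' code; the graph, function names, and expert assignment are hypothetical): a node's neighborhood is partitioned into "shells" of nodes at exact hop distance k, and each shell would be handed to a dedicated expert.

```python
# Illustrative sketch, not the SEA reference implementation:
# partition a node's neighborhood into hop-distance shells via BFS,
# so that each shell could be processed by a dedicated expert.
from collections import deque

def hop_shells(adj, source, max_hops):
    """Return {k: set of nodes at exactly k hops from source}."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        if dist[u] == max_hops:
            continue  # do not expand beyond the outermost shell
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    shells = {k: set() for k in range(max_hops + 1)}
    for node, d in dist.items():
        shells[d].add(node)
    return shells

# Toy graph: path 0-1-2-3 plus an extra edge 1-4 (hypothetical example)
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
shells = hop_shells(adj, source=0, max_hops=2)
# shells[1] == {1}, shells[2] == {2, 4}: expert k sees only the
# subgraph induced by the shells up to hop distance k.
```

In this reading, increasing the number of experts corresponds to distinguishing more of these localized receptive fields around the central node.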
Cite
Text
Frey et al. "SEA: Graph Shell Attention in Graph Neural Networks." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022. doi:10.1007/978-3-031-26390-3_20
Markdown
[Frey et al. "SEA: Graph Shell Attention in Graph Neural Networks." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022.](https://mlanthology.org/ecmlpkdd/2022/frey2022ecmlpkdd-sea/) doi:10.1007/978-3-031-26390-3_20
BibTeX
@inproceedings{frey2022ecmlpkdd-sea,
title = {{SEA: Graph Shell Attention in Graph Neural Networks}},
author = {Frey, Christian M. M. and Ma, Yunpu and Schubert, Matthias},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2022},
pages = {326-343},
doi = {10.1007/978-3-031-26390-3_20},
url = {https://mlanthology.org/ecmlpkdd/2022/frey2022ecmlpkdd-sea/}
}