Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks
Abstract
Edges in many real-world social/information networks are associated with rich text information (e.g., user-user communications or user-product reviews). However, mainstream network representation learning models focus on propagating and aggregating node attributes, lacking specific designs to utilize text semantics on edges. While there exist edge-aware graph neural networks, they directly initialize edge attributes as a feature vector, which cannot fully capture the contextualized text semantics of edges. In this paper, we propose Edgeformers, a framework built upon graph-enhanced Transformers, to perform edge and node representation learning by modeling texts on edges in a contextualized way. Specifically, in edge representation learning, we inject network information into each Transformer layer when encoding edge texts; in node representation learning, we aggregate edge representations through an attention mechanism within each node's ego-graph. On five public datasets from three different domains, Edgeformers consistently outperform state-of-the-art baselines in edge classification and link prediction, demonstrating their efficacy in learning edge and node representations, respectively.
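The node representation step described in the abstract, aggregating edge representations via attention within each node's ego-graph, can be sketched as follows. This is an illustrative sketch under assumed details (dot-product scoring against a hypothetical learnable node-level query vector), not the paper's exact formulation:

```python
import math

def aggregate_node_from_edges(edge_reprs, query):
    """Attention-weighted aggregation of edge representations within a
    node's ego-graph (illustrative sketch, not the paper's formulation).

    edge_reprs: list of d-dimensional vectors, one per incident edge
    query: a d-dimensional node-level query vector (assumed learnable)
    """
    # Score each incident edge representation against the query (dot product).
    scores = [sum(q * e for q, e in zip(query, edge)) for edge in edge_reprs]
    # Softmax over incident edges yields attention weights (max-shifted
    # for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [x / total for x in exps]
    # Node representation: attention-weighted sum of edge representations.
    d = len(query)
    return [sum(w * edge[i] for w, edge in zip(weights, edge_reprs))
            for i in range(d)]
```

With all incident edges sharing one representation, the output reduces to that representation, since the attention weights sum to one.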
Cite
Text
Jin et al. "Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks." International Conference on Learning Representations, 2023.
Markdown
[Jin et al. "Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/jin2023iclr-edgeformers/)
BibTeX
@inproceedings{jin2023iclr-edgeformers,
title = {{Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks}},
author = {Jin, Bowen and Zhang, Yu and Meng, Yu and Han, Jiawei},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/jin2023iclr-edgeformers/}
}