Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling

Abstract

Network modeling aims to learn latent representations of nodes that preserve both network structure and node attribute information. This problem is fundamental due to its prevalence across numerous domains. However, existing approaches either target static networks or struggle to capture complicated temporal dependencies, while most real-world networks evolve over time and the success of network modeling hinges on understanding how entities are temporally connected. In this paper, we present TRRN, a transformer-style relational reasoning network with dynamic memory updating, to address these challenges. TRRN employs multi-head self-attention to reason over a set of memories, which provides a multitude of shortcut paths for information to flow from past observations into the current latent representations. By utilizing policy networks augmented with differentiable binary routers, TRRN estimates the probability of each memory being activated and dynamically updates the memories at the time steps when they are most relevant. We evaluate TRRN on node classification and link prediction over four real-world temporal network datasets. Experimental results demonstrate consistent performance gains for TRRN over the leading competitors.
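
For intuition, below is a minimal PyTorch sketch of the two mechanisms the abstract describes: multi-head self-attention over a set of memory slots, and a differentiable binary router that decides which slots to update. Everything here (the class name MemoryRouterSketch, the slot count, and the binary-Concrete relaxation used for the gate) is an illustrative assumption, not the paper's actual implementation.

import torch
import torch.nn as nn


class MemoryRouterSketch(nn.Module):
    """Illustrative only: self-attention over memory slots plus a
    differentiable binary router gating which slots are updated."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.router = nn.Linear(dim, 1)  # per-slot activation logit

    def forward(self, event, memory, tau: float = 1.0):
        # event:  (batch, dim)        current observation embedding
        # memory: (batch, slots, dim) per-node memory slots
        tokens = torch.cat([event.unsqueeze(1), memory], dim=1)
        attended, _ = self.attn(tokens, tokens, tokens)  # joint reasoning
        node_repr, candidate = attended[:, 0], attended[:, 1:]

        # Relaxed binary gate (binary-Concrete / Gumbel-sigmoid style):
        # differentiable during training, hard-thresholdable at inference.
        logits = self.router(candidate).squeeze(-1)       # (batch, slots)
        u = torch.rand_like(logits).clamp_(1e-6, 1.0 - 1e-6)
        noise = torch.log(u) - torch.log1p(-u)            # Logistic(0, 1) sample
        gate = torch.sigmoid((logits + noise) / tau).unsqueeze(-1)

        # Update only the slots the router activates.
        new_memory = gate * candidate + (1.0 - gate) * memory
        return node_repr, new_memory


# Toy usage: one event updates an 8-slot memory for a batch of 2 nodes.
net = MemoryRouterSketch()
event = torch.randn(2, 64)
memory = torch.zeros(2, 8, 64)
node_repr, memory = net(event, memory)
print(node_repr.shape, memory.shape)  # torch.Size([2, 64]) torch.Size([2, 8, 64])

The relaxation keeps the gate differentiable during training; at inference it can be hard-thresholded so each memory is either updated or left untouched, matching the abstract's notion of memories being activated only at the time steps when they are most relevant.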

Cite

Text

Xu et al. "Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/aaai.v35i5.16583

Markdown

[Xu et al. "Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/xu2021aaai-transformer/) doi:10.1609/aaai.v35i5.16583

BibTeX

@inproceedings{xu2021aaai-transformer,
  title     = {{Transformer-Style Relational Reasoning with Dynamic Memory Updating for Temporal Network Modeling}},
  author    = {Xu, Dongkuan and Liang, Junjie and Cheng, Wei and Wei, Hua and Chen, Haifeng and Zhang, Xiang},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {4546--4554},
  doi       = {10.1609/aaai.v35i5.16583},
  url       = {https://mlanthology.org/aaai/2021/xu2021aaai-transformer/}
}