Pre-Trained Hypergraph Convolutional Neural Networks with Self-Supervised Learning

Abstract

Hypergraphs are powerful tools for modeling complex interactions across various domains, including biomedicine. However, learning meaningful node representations from hypergraphs remains a challenge. Existing supervised methods often lack generalizability, limiting their real-world applicability. We propose a new method, Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning (PhyGCN), which leverages hypergraph structure for self-supervision to enhance node representations. PhyGCN introduces a training strategy that integrates variable hyperedge sizes with self-supervised learning, enabling improved generalization to unseen data. Applications to multi-way chromatin interaction and polypharmacy side-effect data demonstrate the effectiveness of PhyGCN. As a generic framework for high-order interaction datasets with abundant unlabeled data, PhyGCN holds strong potential for enhancing hypergraph node representations across various domains.
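To make the abstract's core operation concrete, below is a minimal sketch of a standard hypergraph convolution layer in the style common in this literature (symmetrically normalized incidence-matrix propagation). This is an illustrative assumption, not PhyGCN's exact architecture; the incidence matrix `H`, feature matrix `X`, and weight matrix `Theta` are hypothetical toy inputs. Note how hyperedges of different sizes are handled uniformly via the hyperedge-degree normalization.

```python
import numpy as np

def hypergraph_conv(H, X, Theta):
    """One hypergraph convolution step (HGNN-style sketch, not PhyGCN's exact layer):
    X' = ReLU(Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta),
    where H (n x m) is the node-hyperedge incidence matrix."""
    Dv = H.sum(axis=1)             # node degrees (how many hyperedges each node joins)
    De = H.sum(axis=0)             # hyperedge sizes (may vary across hyperedges)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Normalized propagation operator over the hypergraph structure
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)  # ReLU nonlinearity

# Toy hypergraph: 4 nodes, 2 hyperedges of different sizes (3 and 3 nodes here)
H = np.array([[1, 0],
              [1, 1],
              [1, 1],
              [0, 1]], dtype=float)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))       # node features
Theta = rng.standard_normal((8, 16))  # learnable weight matrix
out = hypergraph_conv(H, X, Theta)
print(out.shape)  # (4, 16): one embedding per node
```

In a self-supervised setup like the one the abstract describes, layers of this form would be pre-trained on structural signals (e.g. predicting held-out hyperedges) before fine-tuning on labeled tasks.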

Cite

Text

Deng et al. "Pre-Trained Hypergraph Convolutional Neural Networks with Self-Supervised Learning." Transactions on Machine Learning Research, 2024.

Markdown

[Deng et al. "Pre-Trained Hypergraph Convolutional Neural Networks with Self-Supervised Learning." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/deng2024tmlr-pretrained/)

BibTeX

@article{deng2024tmlr-pretrained,
  title     = {{Pre-Trained Hypergraph Convolutional Neural Networks with Self-Supervised Learning}},
  author    = {Deng, Yihe and Zhang, Ruochi and Xu, Pan and Ma, Jian and Gu, Quanquan},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/deng2024tmlr-pretrained/}
}