HyperMagNet: A Magnetic Laplacian Based Hypergraph Neural Network

Abstract

In data science, hypergraphs are natural models for data exhibiting multi-way or group relationships, in contrast to graphs, which only model pairwise relationships. Nonetheless, many proposed hypergraph neural networks effectively reduce hypergraphs to undirected graphs via symmetrized matrix representations, potentially losing important multi-way or group information. We propose an alternative approach to hypergraph neural networks in which the hypergraph is represented as a non-reversible Markov chain. We use this Markov chain to construct a complex Hermitian Laplacian matrix — the magnetic Laplacian — which serves as the input to our proposed hypergraph neural network. We study HyperMagNet for the task of node classification, and demonstrate its effectiveness over graph-reduction based hypergraph neural networks.
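The two ingredients named in the abstract — a hypergraph random walk and the magnetic Laplacian built from it — can be sketched in NumPy. This is a simplified illustration, not the paper's exact construction: the walk below is the plain two-step vertex-edge-vertex walk (the paper uses a non-reversible chain, e.g. via edge-dependent vertex weights), and `q` is the usual charge parameter of the magnetic Laplacian.

```python
import numpy as np

def hypergraph_walk(H):
    """Vertex-to-vertex transition matrix of a hypergraph.

    H is a |V| x |E| 0/1 incidence matrix. A walker at vertex v picks an
    incident hyperedge uniformly, then a vertex of that edge uniformly.
    Simplified sketch; the paper's chain weights these steps so that the
    walk is non-reversible.
    """
    d_v = H.sum(axis=1, keepdims=True)   # vertex degrees, |V| x 1
    d_e = H.sum(axis=0, keepdims=True)   # edge sizes,     1 x |E|
    return (H / d_v) @ (H / d_e).T       # row-stochastic |V| x |V|

def magnetic_laplacian(P, q=0.25):
    """Hermitian magnetic Laplacian of a (possibly asymmetric) matrix P.

    Symmetrizes the weights and pushes the asymmetry of P into a complex
    phase, so directional information survives in a Hermitian operator.
    """
    A_s = (P + P.T) / 2                  # symmetrized weights
    theta = 2 * np.pi * q * (P - P.T)    # antisymmetric phase matrix
    H_mag = A_s * np.exp(1j * theta)     # Hermitian "magnetic" adjacency
    D = np.diag(A_s.sum(axis=1))         # degree matrix of A_s
    return D - H_mag                     # Hermitian, positive semidefinite
```

Because the result is Hermitian and positive semidefinite, it admits a real spectrum and supports the spectral-convolution machinery of standard graph neural networks, while the complex phases retain the chain's directionality.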

Cite

Text

Benko et al. "HyperMagNet: A Magnetic Laplacian Based Hypergraph Neural Network." Transactions on Machine Learning Research, 2025.

Markdown

[Benko et al. "HyperMagNet: A Magnetic Laplacian Based Hypergraph Neural Network." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/benko2025tmlr-hypermagnet/)

BibTeX

@article{benko2025tmlr-hypermagnet,
  title     = {{HyperMagNet: A Magnetic Laplacian Based Hypergraph Neural Network}},
  author    = {Benko, Tatyana and Buck, Martin and Amburg, Ilya and Young, Stephen J. and Aksoy, Sinan Guven},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/benko2025tmlr-hypermagnet/}
}