HyperMixup: Hypergraph-Augmented with Higher-Order Information Mixup

Abstract

Hypergraphs offer a natural paradigm for modeling complex systems with multi-way interactions, and hypergraph neural networks (HGNNs) have demonstrated remarkable success in learning from such higher-order relational data. While higher-order modeling enhances relational reasoning, the effectiveness of hypergraph learning remains bottlenecked by two persistent challenges: the scarcity of labeled data inherent to complex systems, and vulnerability to structural noise in real-world interaction patterns. Traditional data augmentation methods, though successful in Euclidean and graph-structured domains, struggle to preserve the intricate balance between node features and hyperedge semantics, often disrupting the very group-wise interactions that give hypergraphs their value. To bridge this gap, we present HyperMixup, a hypergraph-aware augmentation framework that preserves higher-order interaction patterns through structure-guided feature mixing. Specifically, HyperMixup comprises three critical components: 1) structure-aware node pairing guided by joint feature-hyperedge similarity metrics, 2) context-enhanced hierarchical mixing that preserves hyperedge semantics through dual-level feature fusion, and 3) adaptive topology reconstruction mechanisms that maintain hypergraph consistency while enabling controlled diversity expansion. Theoretically, we establish that our method induces hypergraph-specific regularization effects through gradient alignment with hyperedge covariance structures, while providing robustness guarantees against combined node-hyperedge perturbations. Comprehensive experiments across diverse hypergraph learning tasks demonstrate consistent performance improvements over state-of-the-art baselines, with particular effectiveness in low-label regimes.
The proposed framework advances hypergraph representation learning by unifying data augmentation with higher-order topological constraints, offering both practical utility and theoretical insights for relational machine learning.
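To make the mixing idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation): nodes are paired by the similarity of their hyperedge-incidence rows, and each node's features are mixed with its most structurally similar partner using a Beta-sampled mixing coefficient, as in standard mixup. The function name and all design choices here are our own assumptions; HyperMixup's actual pairing metric, dual-level fusion, and topology reconstruction are richer than this toy.

```python
import numpy as np

def structure_guided_mixup(X, H, alpha=0.2, rng=None):
    """Toy sketch of structure-guided feature mixing (hypothetical,
    not the paper's algorithm).
    X: (n, d) node feature matrix.
    H: (n, m) hyperedge incidence matrix (H[i, e] = 1 iff node i
       belongs to hyperedge e).
    Returns an augmented (n, d) feature matrix."""
    rng = rng or np.random.default_rng(0)
    n = X.shape[0]
    # Cosine similarity between incidence rows: high when two nodes
    # share many hyperedges, i.e. participate in the same groups.
    norms = np.linalg.norm(H, axis=1, keepdims=True) + 1e-8
    S = (H / norms) @ (H / norms).T
    np.fill_diagonal(S, -np.inf)       # exclude self-pairing
    partner = S.argmax(axis=1)         # most structurally similar node
    # Mixup coefficient per node, kept >= 0.5 so the anchor dominates.
    lam = rng.beta(alpha, alpha, size=(n, 1))
    lam = np.maximum(lam, 1.0 - lam)
    return lam * X + (1.0 - lam) * X[partner]
```

Pairing by incidence similarity (rather than uniformly at random, as in vanilla mixup) is what keeps the interpolation consistent with the group-wise structure: mixed nodes tend to belong to overlapping hyperedges, so hyperedge semantics are perturbed less.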

Cite

Text

Yao et al. "HyperMixup: Hypergraph-Augmented with Higher-Order Information Mixup." Advances in Neural Information Processing Systems, 2025.

Markdown

[Yao et al. "HyperMixup: Hypergraph-Augmented with Higher-Order Information Mixup." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/yao2025neurips-hypermixup/)

BibTeX

@inproceedings{yao2025neurips-hypermixup,
  title     = {{HyperMixup: Hypergraph-Augmented with Higher-Order Information Mixup}},
  author    = {Yao, Kaixuan and Li, Zhuo and Liang, Jianqing and Liang, Jiye and Li, Ming and Cao, Feilong},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/yao2025neurips-hypermixup/}
}