Hypergraph Neural Architecture Search

Abstract

In recent years, Hypergraph Neural Networks (HGNNs) have achieved considerable success through manually designed architectures, which can extract effective patterns with high-order interactions from non-Euclidean data. However, such a mechanism is extremely inefficient, demanding tremendous human effort to tune diverse model parameters. In this paper, we propose a novel Hypergraph Neural Architecture Search (HyperNAS) to automatically design optimal HGNNs. The proposed model constructs a search space suitable for hypergraphs and derives hypergraph architectures through a differentiable search strategy. A hypergraph structure-aware distance criterion is introduced as a guideline for obtaining an optimal hypergraph architecture via the leave-one-out method. Experimental results for node classification on the benchmark Cora, Citeseer, and Pubmed citation networks and on hypergraph datasets show that HyperNAS outperforms existing HGNN models and graph NAS methods.
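
The abstract describes a differentiable search over a hypergraph operation space. The sketch below illustrates only the general idea of that family of methods: a softmax-weighted mixture of candidate operations whose mixing weights (the architecture parameters) are learned jointly with the model weights, with a hypergraph convolution as one candidate. The class names (HypergraphConv, MixedOp), the two-candidate operator set, and the simplified degree normalization are illustrative assumptions, not the paper's actual search space, operator set, or selection criterion.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HypergraphConv(nn.Module):
    # Simplified hypergraph convolution: X' = D_v^{-1} H D_e^{-1} H^T X Theta.
    # (Illustrative normalization; not necessarily the paper's operator.)
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim)

    def forward(self, x, H):
        # H: (num_nodes, num_edges) incidence matrix, entries in {0, 1}
        dv = H.sum(dim=1).clamp(min=1)   # node degrees, clamped to avoid /0
        de = H.sum(dim=0).clamp(min=1)   # hyperedge degrees
        x = self.theta(x)
        x = H.t() @ x / de.unsqueeze(1)  # aggregate nodes onto hyperedges
        x = H @ x / dv.unsqueeze(1)      # aggregate hyperedges back onto nodes
        return x

class MixedOp(nn.Module):
    # Differentiable mixture over candidate operations: the softmax of the
    # architecture parameters alpha weights each candidate's output.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.ops = nn.ModuleList([
            HypergraphConv(in_dim, out_dim),
            nn.Linear(in_dim, out_dim),  # plain linear transform as a second candidate
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture weights

    def forward(self, x, H):
        w = F.softmax(self.alpha, dim=0)
        outs = [op(x, H) if isinstance(op, HypergraphConv) else op(x)
                for op in self.ops]
        return sum(wi * oi for wi, oi in zip(w, outs))

# Toy usage: 6 nodes, 3 hyperedges, 8-dim features.
H = (torch.rand(6, 3) > 0.5).float()
x = torch.randn(6, 8)
layer = MixedOp(8, 16)
out = layer(x, H)  # shape (6, 16); alpha is trained jointly with the weights

After the search converges, a discrete architecture is read off by keeping the highest-weighted candidate at each mixed edge; the paper's structure-aware distance criterion with the leave-one-out method refines that selection step.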

Cite

Text

Lin et al. "Hypergraph Neural Architecture Search." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I12.29290

Markdown

[Lin et al. "Hypergraph Neural Architecture Search." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/lin2024aaai-hypergraph/) doi:10.1609/AAAI.V38I12.29290

BibTeX

@inproceedings{lin2024aaai-hypergraph,
  title     = {{Hypergraph Neural Architecture Search}},
  author    = {Lin, Wei and Peng, Xu and Yu, Zhengtao and Jin, Taisong},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {13837--13845},
  doi       = {10.1609/AAAI.V38I12.29290},
  url       = {https://mlanthology.org/aaai/2024/lin2024aaai-hypergraph/}
}