Continuous Simplicial Neural Networks
Abstract
Simplicial complexes provide a powerful framework for modeling higher-order interactions in structured data, making them particularly suitable for applications such as trajectory prediction and mesh processing. However, existing simplicial neural networks (SNNs), whether convolutional or attention-based, rely primarily on discrete filtering techniques, which can be restrictive. In contrast, partial differential equations (PDEs) on simplicial complexes offer a principled approach to capturing continuous dynamics in such structures. In this work, we introduce the continuous simplicial neural network (COSIMO), a novel SNN architecture derived from PDEs on simplicial complexes. We provide theoretical and experimental justifications for COSIMO's stability under simplicial perturbations. Furthermore, we investigate the over-smoothing phenomenon, a common issue in geometric deep learning, and demonstrate that COSIMO offers better control over this effect than discrete SNNs. Our experiments on real-world datasets show that COSIMO achieves competitive performance compared to state-of-the-art SNNs in complex and noisy environments. The implementation code is available at https://github.com/ArefEinizade2/COSIMO.
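To make the core idea concrete, below is a minimal, hedged sketch of a PDE-driven simplicial layer: edge (1-simplex) features are evolved under a heat-diffusion PDE governed by the Hodge 1-Laplacian and then mixed with a learnable weight matrix. The specific complex, the diffusion operator, the function `continuous_simplicial_layer`, and the weight shapes are illustrative assumptions and are not taken from the official COSIMO implementation linked above.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical toy complex: 4 nodes, 5 edges, 1 triangle.
# B1 (nodes x edges) and B2 (edges x triangles) are its incidence matrices.
B1 = np.array([
    [-1, -1,  0,  0,  0],
    [ 1,  0, -1, -1,  0],
    [ 0,  1,  1,  0, -1],
    [ 0,  0,  0,  1,  1],
])
B2 = np.array([
    [ 1],
    [-1],
    [ 1],
    [ 0],
    [ 0],
])

# Hodge 1-Laplacian, the operator driving diffusion of edge signals.
L1 = B1.T @ B1 + B2 @ B2.T

def continuous_simplicial_layer(X, t, W):
    """Evolve edge features X (edges x channels) for time t under the
    heat PDE dX/dt = -L1 X, then mix channels with weights W.
    This is an illustrative sketch, not the authors' architecture."""
    return expm(-t * L1) @ X @ W

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))        # edge features, 3 input channels
W = rng.standard_normal((3, 2)) * 0.1  # hypothetical learned weights
print(continuous_simplicial_layer(X, t=0.5, W=W).shape)  # (5, 2)
```

In this reading, the integration time `t` plays the role of a continuous depth parameter, which is one way a PDE formulation can expose finer control over smoothing than a fixed stack of discrete filters.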
Cite
Text
Einizade et al. "Continuous Simplicial Neural Networks." Advances in Neural Information Processing Systems, 2025.
Markdown
[Einizade et al. "Continuous Simplicial Neural Networks." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/einizade2025neurips-continuous/)
BibTeX
@inproceedings{einizade2025neurips-continuous,
  title     = {{Continuous Simplicial Neural Networks}},
  author    = {Einizade, Aref and Thanou, Dorina and Malliaros, Fragkiskos D. and Giraldo, Jhony H.},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/einizade2025neurips-continuous/}
}