Semi-Implicit Neural Ordinary Differential Equations

Abstract

Classical neural ODEs trained with explicit methods are intrinsically limited by stability constraints, which cripple their efficiency and robustness on the stiff learning problems common in graph learning and scientific machine learning. We present a semi-implicit neural ODE approach that exploits the partitionable structure of the underlying dynamics. Our technique yields an implicit neural network with significant computational advantages over existing approaches, owing to enhanced stability and efficient linear solves during time integration. We show that our approach outperforms existing methods on a variety of applications, including graph classification and learning complex dynamical systems. We also demonstrate that it can train challenging neural ODEs for which both explicit and fully implicit methods are intractable.
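
The key idea the abstract describes is an implicit-explicit (IMEX) treatment of partitioned dynamics: the stiff part of the right-hand side is integrated implicitly, so each step reduces to a linear solve, while the remaining term is handled explicitly. As a minimal sketch of one common instantiation of this idea (not the authors' implementation), assume a partition dy/dt = A y + g(y) with a known stiff linear operator `A` and a learned nonlinear term `g`; the function and variable names and the step size `h` below are illustrative assumptions.

```python
import torch

def imex_euler_step(y, A, g, h):
    """One semi-implicit (IMEX) Euler step for dy/dt = A y + g(y).

    The linear term A y is treated implicitly and the nonlinear term g(y)
    explicitly, so the update solves (I - h A) y_next = y + h g(y):
    a single linear solve per step instead of a nonlinear solve.
    """
    n = y.shape[-1]
    I = torch.eye(n, dtype=y.dtype)
    rhs = y + h * g(y)                          # explicit treatment of g
    return torch.linalg.solve(I - h * A, rhs)   # implicit treatment of A y

# Toy usage: a stiff diagonal linear part and a small learned nonlinear term.
if __name__ == "__main__":
    torch.manual_seed(0)
    A = torch.diag(torch.tensor([-1000.0, -1.0]))                # stiff operator
    g = torch.nn.Sequential(torch.nn.Linear(2, 2), torch.nn.Tanh())
    y = torch.randn(2)
    for _ in range(10):
        y = imex_euler_step(y, A, g, h=0.1)  # stable despite the stiff mode
    print(y)
```

With a step size of 0.1, a fully explicit Euler step would diverge on the -1000 eigenvalue, whereas the semi-implicit update stays stable while only paying for a small linear solve per step.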

Cite

Text

Zhang et al. "Semi-Implicit Neural Ordinary Differential Equations." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I21.34398

Markdown

[Zhang et al. "Semi-Implicit Neural Ordinary Differential Equations." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/zhang2025aaai-semi/) doi:10.1609/AAAI.V39I21.34398

BibTeX

@inproceedings{zhang2025aaai-semi,
  title     = {{Semi-Implicit Neural Ordinary Differential Equations}},
  author    = {Zhang, Hong and Liu, Ying and Maulik, Romit},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {22416--22424},
  doi       = {10.1609/AAAI.V39I21.34398},
  url       = {https://mlanthology.org/aaai/2025/zhang2025aaai-semi/}
}