On Numerical Integration in Neural Ordinary Differential Equations

Abstract

The combination of ordinary differential equations and neural networks, i.e., neural ordinary differential equations (Neural ODE), has been widely studied from various angles. However, deciphering the role of numerical integration in Neural ODEs remains an open challenge, as many studies have demonstrated that numerical integration significantly affects the performance of the model. In this paper, we propose the inverse modified differential equations (IMDE) to clarify the influence of numerical integration on the training of Neural ODE models. The IMDE is determined by the learning task and the employed ODE solver. We show that training a Neural ODE model actually returns a close approximation of the IMDE, rather than the true ODE. With the help of the IMDE, we deduce that (i) the discrepancy between the learned model and the true ODE is bounded by the sum of the discretization error and the learning loss; (ii) Neural ODEs using non-symplectic numerical integration cannot, in theory, learn conservation laws. Several experiments are performed to numerically verify our theoretical analysis.
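To make the IMDE idea concrete, here is a minimal worked example, assuming the forward Euler scheme with step size h; the symbols f, f_h, and phi_h follow standard backward-error-analysis notation and are introduced here for illustration, not taken verbatim from the paper. Requiring one Euler step of a candidate field f_h to reproduce the exact time-h flow phi_h of the true field f, and expanding in powers of h, suggests the leading-order form of the IMDE:

\[
y + h\, f_h(y) \;=\; \phi_h(y) \;=\; y + h\, f(y) + \frac{h^2}{2}\, f'(y) f(y) + \mathcal{O}(h^3)
\quad\Longrightarrow\quad
f_h(y) \;=\; f(y) + \frac{h}{2}\, f'(y) f(y) + \mathcal{O}(h^2).
\]

Under this reading, a Neural ODE trained with forward Euler would recover an approximation of f_h rather than f, and the O(h) gap between the two is the discretization-error contribution in statement (i) of the abstract.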

Cite

Text

Zhu et al. "On Numerical Integration in Neural Ordinary Differential Equations." International Conference on Machine Learning, 2022.

Markdown

[Zhu et al. "On Numerical Integration in Neural Ordinary Differential Equations." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/zhu2022icml-numerical/)

BibTeX

@inproceedings{zhu2022icml-numerical,
  title     = {{On Numerical Integration in Neural Ordinary Differential Equations}},
  author    = {Zhu, Aiqing and Jin, Pengzhan and Zhu, Beibei and Tang, Yifa},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {27527--27547},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/zhu2022icml-numerical/}
}