Analysis of Generalization Capacities of Neural Ordinary Differential Equations

Abstract

Neural ordinary differential equations (neural ODEs) are a widely used class of deep learning models characterized by continuous depth. Understanding generalization error bounds is important for evaluating how well a model is expected to perform on new, unseen data. Earlier work in this direction considered the case of a linear dynamics function (the function that models the evolution of the state variables) of a neural ODE (Marion, 2023). Other related work derives a bound for neural controlled ODEs that depends on the sampling gap (Bleistein & Guilloux, 2023). We consider a class of neural ODEs with a general nonlinear dynamics function, in both time-dependent and time-independent cases, that is Lipschitz with respect to the state variables. We observe that if the dynamics function is Lipschitz continuous with respect to the hidden state, then the solution of the neural ODE is of bounded variation. We derive generalization bounds for both time-dependent and time-independent neural ODEs and show the effect of overparameterization and of the domain bound on the generalization error bound. To our knowledge, this is the first generalization bound for neural ODEs with a general nonlinear dynamics function.
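
For readers unfamiliar with the setting, the sketch below (not taken from the paper) illustrates a time-dependent neural ODE dh(t)/dt = f(h(t), t) whose dynamics function is a tanh MLP, and is therefore Lipschitz in the hidden state, as the abstract assumes. All names here (`LipschitzDynamics`, `odeint_euler`) are hypothetical, and a fixed-step Euler integrator stands in for the adaptive solvers typically used in practice.

```python
# Minimal sketch, assuming a tanh MLP dynamics function f(h, t).
# tanh is 1-Lipschitz, so f is Lipschitz in h with a constant bounded
# by the product of the spectral norms of the weight matrices.
import torch
import torch.nn as nn


class LipschitzDynamics(nn.Module):
    """Time-dependent dynamics f(h, t), Lipschitz in the hidden state h."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden),  # +1 input feature for time t
            nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, h: torch.Tensor, t: float) -> torch.Tensor:
        # Append t as an extra feature column to each state in the batch.
        t_col = torch.full((h.shape[0], 1), float(t))
        return self.net(torch.cat([h, t_col], dim=1))


def odeint_euler(f: nn.Module, h0: torch.Tensor,
                 t0: float = 0.0, t1: float = 1.0, steps: int = 100):
    """Integrate dh/dt = f(h, t) from t0 to t1 with explicit Euler steps."""
    h, dt = h0, (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt)
    return h


if __name__ == "__main__":
    f = LipschitzDynamics(dim=2)
    h0 = torch.randn(8, 2)       # batch of initial hidden states
    h1 = odeint_euler(f, h0)     # hidden states at t = 1
    print(h1.shape)              # torch.Size([8, 2])
```

Dropping the time input to `f` gives the time-independent (autonomous) case the abstract also covers.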

Cite

Text

Verma and Kumar. "Analysis of Generalization Capacities of Neural Ordinary Differential Equations." Transactions on Machine Learning Research, 2025.

Markdown

[Verma and Kumar. "Analysis of Generalization Capacities of Neural Ordinary Differential Equations." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/verma2025tmlr-analysis/)

BibTeX

@article{verma2025tmlr-analysis,
  title     = {{Analysis of Generalization Capacities of Neural Ordinary Differential Equations}},
  author    = {Verma, Madhusudan and Kumar, Manoj},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/verma2025tmlr-analysis/}
}