Stability-Informed Initialization of Neural Ordinary Differential Equations

Abstract

This paper addresses the training of Neural Ordinary Differential Equations (neural ODEs), and in particular explores the interplay between numerical integration techniques, stability regions, step size, and initialization. It is shown how the choice of integration technique implicitly regularizes the learned model, and how the solver's corresponding stability region affects training and prediction performance. From this analysis, a stability-informed parameter initialization technique is introduced. The effectiveness of the initialization method is demonstrated across several learning benchmarks and industrial applications.
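To make the core idea concrete: for a linear test system x' = Ax integrated with explicit Euler at step size h, the update x_{k+1} = (I + hA)x_k is stable exactly when every eigenvalue λ of A satisfies |1 + hλ| < 1. A minimal sketch of a stability-informed initialization in this spirit (an illustration of the general principle, not the authors' exact algorithm; the function name and `margin` parameter are assumptions):

```python
import numpy as np

def stability_informed_init(dim, h, rng, margin=0.9):
    """Illustrative sketch: draw a random matrix A such that the explicit
    Euler update matrix I + h*A has spectral radius <= margin < 1, i.e.
    every h*eigenvalue of A lies inside the stability disc |1 + z| < 1.

    Construction: draw D at random, rescale its spectral radius to
    `margin`, then set A = (D - I)/h so that I + h*A = D exactly.
    """
    D = rng.standard_normal((dim, dim))
    D *= margin / np.max(np.abs(np.linalg.eigvals(D)))
    return (D - np.eye(dim)) / h

# Usage: initialize an 8-dimensional linear vector field for step size 0.1
rng = np.random.default_rng(0)
h = 0.1
A = stability_informed_init(8, h, rng)
# All h*eigenvalues of A lie inside the explicit Euler stability region.
print(np.max(np.abs(1.0 + h * np.linalg.eigvals(A))))
```

Because I + hA equals the rescaled random matrix D by construction, stability holds exactly rather than approximately; an analogous spectral rescaling can be applied to the weight matrices of a neural vector field at initialization.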

Cite

Text

Westny et al. "Stability-Informed Initialization of Neural Ordinary Differential Equations." International Conference on Machine Learning, 2024.

Markdown

[Westny et al. "Stability-Informed Initialization of Neural Ordinary Differential Equations." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/westny2024icml-stabilityinformed/)

BibTeX

@inproceedings{westny2024icml-stabilityinformed,
  title     = {{Stability-Informed Initialization of Neural Ordinary Differential Equations}},
  author    = {Westny, Theodor and Mohammadi, Arman and Jung, Daniel and Frisk, Erik},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {52903--52914},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/westny2024icml-stabilityinformed/}
}