Neural ODE Processes: A Short Summary

Abstract

Neural Ordinary Differential Equations (NODEs) use a neural network to model the instantaneous rate of change in the state of a system. However, despite their apparent suitability for dynamics-governed time-series, NODEs present a few disadvantages. First, they are unable to adapt to incoming data-points, a fundamental requirement for real-time applications imposed by the natural direction of time. Second, time-series are often composed of a sparse set of measurements, which could be explained by many possible underlying dynamics. NODEs do not capture this uncertainty. To this end, we introduce Neural ODE Processes (NDPs), a new class of stochastic processes determined by a distribution over Neural ODEs. By maintaining an adaptive data-dependent distribution over the underlying ODE, we show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points. At the same time, we demonstrate that NDPs scale up to challenging high-dimensional time-series with unknown latent dynamics such as rotating MNIST digits. Code is available online at https://github.com/crisbodnar/ndp.
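To make the abstract's contrast concrete, here is a minimal illustrative sketch, not the authors' implementation: a deterministic NODE integrates a single learned vector field, while an NDP-style model samples a latent variable that modulates the field, yielding a distribution over ODEs. The vector fields `f_theta` and `f_z`, the fixed-step Euler integrator, and the additive role of the latent `z` are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "learned" parameters standing in for a trained neural network.
W = rng.normal(scale=0.5, size=(2, 2))
b = np.zeros(2)

def f_theta(h):
    """NODE vector field: models the instantaneous rate of change dh/dt."""
    return np.tanh(h @ W.T + b)

def odeint_euler(f, h0, t1=1.0, steps=100):
    """Integrate dh/dt = f(h) from t=0 to t=t1 with fixed-step Euler."""
    h, dt = np.asarray(h0, dtype=float), t1 / steps
    for _ in range(steps):
        h = h + dt * f(h)
    return h

# A single NODE gives one deterministic trajectory from an initial state.
h1 = odeint_euler(f_theta, [1.0, 0.0])

# NDP-style sketch (assumed form): a latent z, drawn from a data-dependent
# distribution in the full model, modulates the vector field, so each sample
# of z defines a different plausible underlying ODE.
def f_z(h, z):
    return np.tanh(h @ W.T + b + z)

# Sampling several z's yields several plausible dynamics for the same
# observations, capturing uncertainty a single deterministic NODE cannot.
endpoints = [odeint_euler(lambda h: f_z(h, rng.normal(size=2)), [1.0, 0.0])
             for _ in range(5)]
```

In the actual model, `z` is inferred from the observed context points by an encoder network rather than drawn from a fixed prior as here; adapting that posterior as new points arrive is what lets NDPs handle incoming data.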

Cite

Text

Norcliffe et al. "Neural ODE Processes: A Short Summary." NeurIPS 2021 Workshops: DLDE, 2021.

Markdown

[Norcliffe et al. "Neural ODE Processes: A Short Summary." NeurIPS 2021 Workshops: DLDE, 2021.](https://mlanthology.org/neuripsw/2021/norcliffe2021neuripsw-neural/)

BibTeX

@inproceedings{norcliffe2021neuripsw-neural,
  title     = {{Neural ODE Processes: A Short Summary}},
  author    = {Norcliffe, Alexander Luke Ian and Bodnar, Cristian and Day, Ben and Moss, Jacob and Lio, Pietro},
  booktitle = {NeurIPS 2021 Workshops: DLDE},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/norcliffe2021neuripsw-neural/}
}