Latent Ordinary Differential Equations for Irregularly-Sampled Time Series
Abstract
Time series with non-uniform intervals occur in many applications and are difficult to model using standard recurrent neural networks (RNNs). We generalize RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model we call ODE-RNNs. Furthermore, we use ODE-RNNs to replace the recognition network of the recently proposed Latent ODE model. Both ODE-RNNs and Latent ODEs can naturally handle arbitrary time gaps between observations, and can explicitly model the probability of observation times using Poisson processes. We show experimentally that these ODE-based models outperform their RNN-based counterparts on irregularly-sampled data.
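The core idea can be sketched in a few lines: between observations the hidden state evolves under an ODE, and at each observation a standard RNN-style update folds in the new value. The toy sketch below uses a scalar hidden state, a stand-in tanh dynamics function, and a hand-written gated update with fixed-step Euler integration; the paper's actual model uses learned neural networks for both pieces and an adaptive ODE solver, so every function here is an illustrative assumption, not the authors' implementation.

```python
import math

def odernn_step(h, x, dt, n_euler=10):
    """One ODE-RNN update for a scalar hidden state (toy sketch).

    Between observations, h evolves under an assumed ODE dh/dt = f(h);
    here f(h) = tanh(h) is a stand-in for the paper's learned dynamics
    network. At the observation time, a hand-rolled gated update (a
    stand-in for the paper's RNN cell) incorporates the new value x.
    """
    # 1) Evolve h across the time gap dt with fixed-step Euler integration.
    step = dt / n_euler
    for _ in range(n_euler):
        h = h + step * math.tanh(h)  # toy dynamics, not a learned network
    # 2) RNN-style gated update at the observation itself.
    z = 1.0 / (1.0 + math.exp(-(h + x)))  # toy update gate
    h_cand = math.tanh(x + h)             # toy candidate state
    return (1.0 - z) * h + z * h_cand

# Process an irregularly-sampled series of (time, value) pairs;
# note the uneven gaps, which the Euler stage handles via dt.
obs = [(0.0, 0.5), (0.3, -0.2), (1.7, 0.9)]
h, t_prev = 0.0, 0.0
for t, x in obs:
    h = odernn_step(h, x, dt=t - t_prev)
    t_prev = t
```

Because the hidden state is integrated over the exact elapsed time `dt`, no resampling or gap-imputation step is needed, which is what lets the model handle arbitrary observation times.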
Cite
Text
Rubanova et al. "Latent Ordinary Differential Equations for Irregularly-Sampled Time Series." Neural Information Processing Systems, 2019.
Markdown
[Rubanova et al. "Latent Ordinary Differential Equations for Irregularly-Sampled Time Series." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/rubanova2019neurips-latent/)
BibTeX
@inproceedings{rubanova2019neurips-latent,
title = {{Latent Ordinary Differential Equations for Irregularly-Sampled Time Series}},
author = {Rubanova, Yulia and Chen, Ricky T. Q. and Duvenaud, David K.},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {5320-5330},
url = {https://mlanthology.org/neurips/2019/rubanova2019neurips-latent/}
}