Latent Space Energy-Based Neural ODEs

Abstract

This paper introduces novel deep dynamical models designed to represent continuous-time sequences. Our approach employs a neural emission model to generate each data point in the time series through a non-linear transformation of a latent state vector. The evolution of these latent states is implicitly defined by a neural ordinary differential equation (ODE), with the initial state drawn from an informative prior distribution parameterized by an energy-based model (EBM). This framework is extended to disentangle dynamic states from underlying static factors of variation, represented as time-invariant variables in the latent space. We train the model using maximum likelihood estimation with Markov chain Monte Carlo (MCMC) sampling in an end-to-end manner. Experimental results on oscillating systems, videos, and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts and can generalize to new dynamic parameterizations, enabling long-horizon predictions.
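The generative pipeline in the abstract (sample an initial latent state from an EBM prior via MCMC, integrate a neural ODE in latent space, decode each state with an emission model) can be sketched minimally as follows. This is an illustrative NumPy toy, not the paper's implementation: the dimensions, network sizes, Euler integrator, and short-run Langevin sampler are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not taken from the paper).
latent_dim, obs_dim, hidden = 4, 8, 16

# Random weights standing in for trained networks: ODE drift f(z),
# emission g(z), and the EBM energy head E(z).
Wf1 = rng.normal(0, 0.1, (hidden, latent_dim))
Wf2 = rng.normal(0, 0.1, (latent_dim, hidden))
Wg = rng.normal(0, 0.5, (obs_dim, latent_dim))
We1 = rng.normal(0, 0.1, (hidden, latent_dim))
we2 = rng.normal(0, 0.1, hidden)

def drift(z):
    # Neural ODE vector field: dz/dt = f(z), here a tiny tanh MLP.
    return Wf2 @ np.tanh(Wf1 @ z)

def emit(z):
    # Emission model: non-linear map from latent state to observation.
    return np.tanh(Wg @ z)

def energy(z):
    # EBM energy: a learned correction term on top of a Gaussian base.
    return we2 @ np.tanh(We1 @ z) + 0.5 * z @ z

def grad_energy(z, eps=1e-4):
    # Finite-difference gradient for this toy; autograd would be used in practice.
    g = np.zeros_like(z)
    for i in range(len(z)):
        d = np.zeros_like(z)
        d[i] = eps
        g[i] = (energy(z + d) - energy(z - d)) / (2 * eps)
    return g

def langevin_sample(steps=50, step_size=0.05):
    # Short-run Langevin dynamics (an MCMC sampler) to draw z0 from the EBM prior.
    z = rng.normal(size=latent_dim)
    for _ in range(steps):
        z = (z - 0.5 * step_size**2 * grad_energy(z)
             + step_size * rng.normal(size=latent_dim))
    return z

def rollout(z0, n_steps=20, dt=0.1):
    # Forward-Euler integration of the latent ODE, decoding each state.
    z, traj = z0, []
    for _ in range(n_steps):
        z = z + dt * drift(z)
        traj.append(emit(z))
    return np.stack(traj)

z0 = langevin_sample()          # initial state from the EBM prior
x = rollout(z0)                 # decoded continuous-time sequence
print(x.shape)                  # (20, 8)
```

In practice one would replace the Euler loop with an adaptive ODE solver and backpropagate the maximum-likelihood objective through both the solver and the Langevin chain, as the abstract's end-to-end MCMC training suggests.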

Cite

Text

Cheng et al. "Latent Space Energy-Based Neural ODEs." Transactions on Machine Learning Research, 2025.

Markdown

[Cheng et al. "Latent Space Energy-Based Neural ODEs." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/cheng2025tmlr-latent/)

BibTeX

@article{cheng2025tmlr-latent,
  title     = {{Latent Space Energy-Based Neural ODEs}},
  author    = {Cheng, Sheng and Kong, Deqian and Xie, Jianwen and Lee, Kookjin and Wu, Ying Nian and Yang, Yezhou},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/cheng2025tmlr-latent/}
}