Variational Integrator Networks for Physically Structured Embeddings

Abstract

Learning workable representations of dynamical systems is becoming an increasingly important problem in a number of application areas. By leveraging recent work connecting deep neural networks to systems of differential equations, we propose \emph{variational integrator networks}, a class of neural network architectures designed to preserve the geometric structure of physical systems. This class of network architectures facilitates accurate long-term prediction, interpretability, and data-efficient learning, while still remaining highly flexible and capable of modeling complex behavior. We demonstrate that they can accurately learn dynamical systems from both noisy observations in phase space and from image pixels within which the unknown dynamics are embedded.
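
To make the idea concrete, below is a rough, hypothetical sketch (not the authors' implementation) of the kind of structure-preserving update the abstract describes: a small network stands in for the force term inside a Störmer–Verlet-style two-step position update, so rollouts follow a discrete integrator rather than a generic recurrent map. The class name `VerletRollout`, the step size, the network shape, and the training snippet are all illustrative assumptions.

```python
# Hypothetical sketch of a variational-integrator-style rollout (not the paper's code).
# A learned force f_theta drives a Stormer-Verlet-like position update:
#   q_{n+1} = 2*q_n - q_{n-1} + h^2 * f_theta(q_n)
import torch
import torch.nn as nn


class VerletRollout(nn.Module):
    """Rolls out trajectories with a discrete, integrator-structured update."""

    def __init__(self, dim: int, step_size: float = 0.1, hidden: int = 64):
        super().__init__()
        self.h = step_size
        # f_theta plays the role of a generalized force, e.g. -dU/dq.
        self.force = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, q_prev: torch.Tensor, q_curr: torch.Tensor, n_steps: int):
        traj = [q_prev, q_curr]
        for _ in range(n_steps):
            # Two-step recurrence: only the force term is learned.
            q_next = 2 * traj[-1] - traj[-2] + self.h ** 2 * self.force(traj[-1])
            traj.append(q_next)
        return torch.stack(traj, dim=1)  # (batch, n_steps + 2, dim)


# Toy usage: fit the rollout to observed positions of a 1-D system.
model = VerletRollout(dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
q = torch.randn(32, 12, 1)  # batch of short observed trajectories (placeholder data)
pred = model(q[:, 0], q[:, 1], n_steps=10)
loss = ((pred - q) ** 2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

The intent of such an architecture is that, because learning is confined to the force term of a symmetric two-step recurrence, long rollouts inherit the qualitative behavior of a structure-preserving integrator instead of drifting like an unconstrained recurrent network.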

Cite

Text

Saemundsson et al. "Variational Integrator Networks for Physically Structured Embeddings." Artificial Intelligence and Statistics, 2020.

Markdown

[Saemundsson et al. "Variational Integrator Networks for Physically Structured Embeddings." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/saemundsson2020aistats-variational/)

BibTeX

@inproceedings{saemundsson2020aistats-variational,
  title     = {{Variational Integrator Networks for Physically Structured Embeddings}},
  author    = {Saemundsson, Steindor and Terenin, Alexander and Hofmann, Katja and Deisenroth, Marc},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2020},
  pages     = {3078--3087},
  volume    = {108},
  url       = {https://mlanthology.org/aistats/2020/saemundsson2020aistats-variational/}
}