Symplectic Recurrent Neural Networks

Abstract

We propose Symplectic Recurrent Neural Networks (SRNNs) as learning algorithms that capture the dynamics of physical systems from observed trajectories. SRNNs model the Hamiltonian function of the system with a neural network, and leverage symplectic integration, multiple-step training and initial state optimization to address the challenging numerical issues associated with Hamiltonian systems. We show SRNNs succeed reliably on complex and noisy Hamiltonian systems. Finally, we show how to augment the SRNN integration scheme in order to handle stiff dynamical systems such as bouncing billiards.
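The symplectic integration the abstract refers to can be illustrated with a leapfrog (Störmer–Verlet) step for a separable Hamiltonian H(q, p) = T(p) + V(q). This is a minimal sketch, not the paper's implementation: in an SRNN, `grad_T` and `grad_V` would be gradients of learned neural networks, whereas here a harmonic oscillator (T = p²/2, V = q²/2) stands in for them.

```python
import numpy as np

def leapfrog_step(q, p, grad_V, grad_T, dt):
    """One leapfrog step for a separable Hamiltonian H(q, p) = T(p) + V(q).

    In an SRNN, grad_V and grad_T would come from neural networks; here they
    are plain callables so the sketch is self-contained.
    """
    p_half = p - 0.5 * dt * grad_V(q)            # half kick on momentum
    q_next = q + dt * grad_T(p_half)             # full drift on position
    p_next = p_half - 0.5 * dt * grad_V(q_next)  # second half kick
    return q_next, p_next

def energy(q, p):
    # Harmonic-oscillator Hamiltonian used as a stand-in for a learned one.
    return 0.5 * p**2 + 0.5 * q**2

q, p = 1.0, 0.0
dt = 0.1
e0 = energy(q, p)
for _ in range(1000):
    q, p = leapfrog_step(q, p, lambda q: q, lambda p: p, dt)

# Because leapfrog is symplectic, the energy error stays bounded over long
# rollouts instead of drifting, which is the numerical property SRNNs exploit.
print(abs(energy(q, p) - e0))
```

A non-symplectic scheme such as forward Euler would instead accumulate energy without bound over the same rollout, which is why the choice of integrator matters when unrolling learned Hamiltonian dynamics.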

Cite

Text

Chen et al. "Symplectic Recurrent Neural Networks." International Conference on Learning Representations, 2020.

Markdown

[Chen et al. "Symplectic Recurrent Neural Networks." International Conference on Learning Representations, 2020.](https://mlanthology.org/iclr/2020/chen2020iclr-symplectic/)

BibTeX

@inproceedings{chen2020iclr-symplectic,
  title     = {{Symplectic Recurrent Neural Networks}},
  author    = {Chen, Zhengdao and Zhang, Jianyu and Arjovsky, Martin and Bottou, Léon},
  booktitle = {International Conference on Learning Representations},
  year      = {2020},
  url       = {https://mlanthology.org/iclr/2020/chen2020iclr-symplectic/}
}