Discovering Lyapunov Functions with Transformers
Abstract
We consider a long-standing open problem in mathematics: discovering the Lyapunov functions that control the global stability of dynamical systems. We propose a method for generating training data, and train sequence-to-sequence transformers to predict the Lyapunov functions of polynomial and non-polynomial systems with high accuracy. We also introduce a new baseline for this problem, and show that our models achieve state-of-the-art results, outperforming approximation-based techniques and sum-of-squares algorithmic routines.
Cite
Text
Alfarano et al. "Discovering Lyapunov Functions with Transformers." NeurIPS 2023 Workshops: MATH-AI, 2023.
Markdown
[Alfarano et al. "Discovering Lyapunov Functions with Transformers." NeurIPS 2023 Workshops: MATH-AI, 2023.](https://mlanthology.org/neuripsw/2023/alfarano2023neuripsw-discovering/)
BibTeX
@inproceedings{alfarano2023neuripsw-discovering,
title = {{Discovering Lyapunov Functions with Transformers}},
author = {Alfarano, Alberto and Charton, Francois and Hayat, Amaury},
booktitle = {NeurIPS 2023 Workshops: MATH-AI},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/alfarano2023neuripsw-discovering/}
}