Global Lyapunov Functions: A Long-Standing Open Problem in Mathematics, with Symbolic Transformers

Abstract

Despite their spectacular progress, language models still struggle on complex reasoning tasks, such as advanced mathematics. We consider a long-standing open problem in mathematics: discovering a Lyapunov function that ensures the global stability of a dynamical system. This problem has no known general solution, and algorithmic solvers only exist for some small polynomial systems. We propose a new method for generating synthetic training samples from random solutions, and show that sequence-to-sequence transformers trained on such datasets perform better than algorithmic solvers and humans on polynomial systems, and can discover new Lyapunov functions for non-polynomial systems.
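For context, the standard textbook definition of the object the paper searches for (not stated in the abstract itself) can be sketched in LaTeX as follows:

```latex
% Global Lyapunov function: standard definition (background, not from the paper).
% Consider a dynamical system $\dot{x} = f(x)$ with equilibrium $f(0) = 0$.
A function $V : \mathbb{R}^n \to \mathbb{R}$ is a \emph{global Lyapunov function} if
\[
  V(0) = 0, \qquad V(x) > 0 \ \text{for all } x \neq 0,
  \qquad \lim_{\|x\| \to \infty} V(x) = \infty,
\]
and its derivative along trajectories is negative away from the equilibrium:
\[
  \dot{V}(x) = \nabla V(x) \cdot f(x) < 0 \quad \text{for all } x \neq 0 .
\]
% The existence of such a $V$ guarantees that the equilibrium $x = 0$
% is globally asymptotically stable; finding $V$ for a given $f$ is
% the open problem the paper addresses.
```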

Cite

Text

Alfarano et al. "Global Lyapunov Functions: A Long-Standing Open Problem in Mathematics, with Symbolic Transformers." Neural Information Processing Systems, 2024. doi:10.52202/079017-2969

Markdown

[Alfarano et al. "Global Lyapunov Functions: A Long-Standing Open Problem in Mathematics, with Symbolic Transformers." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/alfarano2024neurips-global/) doi:10.52202/079017-2969

BibTeX

@inproceedings{alfarano2024neurips-global,
  title     = {{Global Lyapunov Functions: A Long-Standing Open Problem in Mathematics, with Symbolic Transformers}},
  author    = {Alfarano, Alberto and Charton, François and Hayat, Amaury},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2969},
  url       = {https://mlanthology.org/neurips/2024/alfarano2024neurips-global/}
}