Lie Point Symmetry and Physics Informed Networks

Abstract

Physics-informed neural networks (PINNs) are computationally efficient alternatives to traditional partial differential equation (PDE) solvers. However, their reliability depends on the accuracy of the trained neural network. In this work, we introduce a mechanism for leveraging the symmetries of a given PDE to improve PINN performance. In particular, we propose a loss function that informs the network about Lie point symmetries, similar to how traditional PINN models try to enforce the underlying PDE. Intuitively, our symmetry loss ensures that infinitesimal generators of the Lie group preserve solutions of the PDE. Effectively, this means that once the network learns a solution, it also learns the neighbouring solutions generated by Lie point symmetries. Our results confirm that Lie point symmetries of the respective PDEs are an effective inductive bias for PINNs and can lead to a significant increase in sample efficiency.
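The abstract's idea can be illustrated with a toy sketch (not the paper's actual implementation): alongside the usual PINN loss, which penalizes the PDE residual at collocation points, one adds a symmetry term that penalizes the residual of a solution transformed by a Lie point symmetry. Here we use the 1D heat equation u_t = u_xx with its space-translation symmetry as a hypothetical example, with derivatives approximated by finite differences rather than automatic differentiation.

```python
import numpy as np

def residual(u, x, t, h=1e-4):
    # PDE residual u_t - u_xx for the heat equation (toy example,
    # not a PDE from the paper's experiments)
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h**2
    return u_t - u_xx

def pinn_loss(u, xs, ts):
    # standard PINN loss: mean squared PDE residual over collocation points
    return np.mean(residual(u, xs, ts) ** 2)

def symmetry_loss(u, xs, ts, eps=1e-2):
    # toy symmetry loss: space translation x -> x - eps is a Lie point
    # symmetry of the heat equation, so the shifted field must also
    # satisfy the PDE; penalize its residual as well.
    shifted = lambda x, t: u(x - eps, t)
    return np.mean(residual(shifted, xs, ts) ** 2)

# An exact heat-equation solution: both loss terms vanish (up to
# finite-difference error), as the symmetry maps solutions to solutions.
exact = lambda x, t: np.exp(-t) * np.sin(x)
xs = np.linspace(0.1, 3.0, 50)
ts = np.full_like(xs, 0.5)
total = pinn_loss(exact, xs, ts) + symmetry_loss(exact, xs, ts)
```

In the paper's actual setting, `u` would be a neural network and `total` would be minimized by gradient descent; the point of the sketch is only that the symmetry term adds supervision on transformed copies of the learned solution at no extra data cost.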

Cite

Text

Akhound-Sadegh et al. "Lie Point Symmetry and Physics Informed Networks." ICML 2023 Workshops: TAGML, 2023.

Markdown

[Akhound-Sadegh et al. "Lie Point Symmetry and Physics Informed Networks." ICML 2023 Workshops: TAGML, 2023.](https://mlanthology.org/icmlw/2023/akhoundsadegh2023icmlw-lie/)

BibTeX

@inproceedings{akhoundsadegh2023icmlw-lie,
  title     = {{Lie Point Symmetry and Physics Informed Networks}},
  author    = {Akhound-Sadegh, Tara and Perreault-Levasseur, Laurence and Brandstetter, Johannes and Welling, Max and Ravanbakhsh, Siamak},
  booktitle = {ICML 2023 Workshops: TAGML},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/akhoundsadegh2023icmlw-lie/}
}