Physics-Informed Transformer Networks
Abstract
Physics-informed neural networks (PINNs) have been recognized as a viable alternative to conventional numerical solvers for Partial Differential Equations (PDEs). The main appeal of PINNs is that, because they directly enforce the PDE residual, training the model does not require access to costly ground-truth solutions. However, a key challenge is their limited generalization across varied initial conditions. Addressing this, our study presents a novel Physics-Informed Transformer (PIT) model for learning the solution operator for PDEs. Using the attention mechanism, PIT learns to leverage the relationships between the initial condition and the query points, resulting in a significant improvement in generalization. Moreover, in contrast to existing physics-informed networks, our model is invariant to the discretization of the input domain, providing great flexibility in problem specification and training. We validate our proposed method on the 1D Burgers' and the 2D Heat equations, demonstrating notable improvement over standard PINN models for operator learning with negligible computational overhead.
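To illustrate the physics-informed training signal the abstract refers to, the sketch below computes the PDE-residual loss for the 1D Burgers' equation, u_t + u·u_x = ν·u_xx, via automatic differentiation. This is a minimal generic PINN-style loss, not the paper's PIT architecture; the MLP, viscosity ν, and collocation-point sampling are illustrative assumptions.

```python
# Minimal sketch of a physics-informed residual loss for the 1D Burgers'
# equation, u_t + u*u_x = nu*u_xx. The network and nu are illustrative
# assumptions; the paper's PIT model replaces this MLP with a transformer
# conditioned on the initial condition.
import torch

torch.manual_seed(0)
nu = 0.01  # assumed viscosity

# Hypothetical stand-in for the solution network: maps (t, x) -> u(t, x).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def burgers_residual(t, x):
    """PDE residual u_t + u*u_x - nu*u_xx at collocation points (t, x)."""
    t = t.requires_grad_(True)
    x = x.requires_grad_(True)
    u = net(torch.stack([t, x], dim=-1)).squeeze(-1)
    # First derivatives via autograd; create_graph keeps the graph so we
    # can differentiate u_x once more for the diffusion term.
    u_t, u_x = torch.autograd.grad(u.sum(), (t, x), create_graph=True)
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx

# Physics loss at random collocation points -- no ground-truth solution needed.
t = torch.rand(256)
x = torch.rand(256) * 2.0 - 1.0
loss = burgers_residual(t, x).pow(2).mean()
```

In a full training loop this residual loss would be combined with initial- and boundary-condition terms and minimized with a standard optimizer.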
Cite
Text
Dos Santos et al. "Physics-Informed Transformer Networks." NeurIPS 2023 Workshops: DLDE, 2023.
Markdown
[Dos Santos et al. "Physics-Informed Transformer Networks." NeurIPS 2023 Workshops: DLDE, 2023.](https://mlanthology.org/neuripsw/2023/santos2023neuripsw-physicsinformed/)
BibTeX
@inproceedings{santos2023neuripsw-physicsinformed,
title = {{Physics-Informed Transformer Networks}},
author = {Dos Santos, Fabricio and Akhound-Sadegh, Tara and Ravanbakhsh, Siamak},
booktitle = {NeurIPS 2023 Workshops: DLDE},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/santos2023neuripsw-physicsinformed/}
}