PDEformer: Towards a Foundation Model for One-Dimensional Partial Differential Equations

Abstract

This paper introduces PDEformer, a neural solver for partial differential equations (PDEs) capable of simultaneously addressing various types of PDEs. We propose representing the PDE as a computational graph, facilitating the seamless integration of the symbolic and numerical information inherent in a PDE. A graph Transformer and an implicit neural representation (INR) are employed to generate mesh-free predicted solutions. After pretraining on data exhibiting a certain level of diversity, our model achieves zero-shot accuracy on benchmark datasets comparable to that of specifically trained expert models. Additionally, PDEformer demonstrates promising results on the inverse problem of PDE coefficient recovery.
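To make the pipeline described in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of the three-stage idea: a PDE encoded as a graph of symbolic nodes with numeric coefficients, a Transformer encoder standing in for the graph Transformer to produce a latent conditioning code, and an INR that maps query coordinates plus that code to mesh-free solution values. All module names, sizes, and the node vocabulary are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GraphEncoder(nn.Module):
    """Embed PDE graph nodes (symbolic type + numeric coefficient) and mix them
    with a standard Transformer encoder (a stand-in for a graph Transformer)."""

    def __init__(self, num_node_types=16, d_model=128, nhead=4, num_layers=4):
        super().__init__()
        self.type_embed = nn.Embedding(num_node_types, d_model)
        self.value_proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, node_types, node_values):
        # node_types: (B, N) long tensor, node_values: (B, N, 1) float tensor
        h = self.type_embed(node_types) + self.value_proj(node_values)
        h = self.encoder(h)              # (B, N, d_model)
        return h.mean(dim=1)             # pooled latent PDE code, (B, d_model)


class INRDecoder(nn.Module):
    """Mesh-free decoder: maps a query point (t, x), conditioned on the latent
    PDE code, to the predicted solution value u(t, x)."""

    def __init__(self, d_model=128, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 + d_model, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords, latent):
        # coords: (B, Q, 2) query points, latent: (B, d_model)
        latent = latent.unsqueeze(1).expand(-1, coords.shape[1], -1)
        return self.net(torch.cat([coords, latent], dim=-1))  # (B, Q, 1)


# Toy usage: one PDE graph with 8 nodes, queried at 100 space-time points.
encoder, decoder = GraphEncoder(), INRDecoder()
node_types = torch.randint(0, 16, (1, 8))   # symbolic node categories (assumed)
node_values = torch.randn(1, 8, 1)          # numeric coefficients per node
coords = torch.rand(1, 100, 2)              # (t, x) queries in [0, 1]^2
u_pred = decoder(coords, encoder(node_types, node_values))
print(u_pred.shape)                         # torch.Size([1, 100, 1])
```

Because the decoder takes arbitrary (t, x) coordinates rather than a fixed grid, predictions remain mesh-free; the same latent code can be queried at any resolution, which is the property the abstract attributes to the INR component.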

Cite

Text

Ye et al. "PDEformer: Towards a Foundation Model for One-Dimensional Partial Differential Equations." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.

Markdown

[Ye et al. "PDEformer: Towards a Foundation Model for One-Dimensional Partial Differential Equations." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.](https://mlanthology.org/iclrw/2024/ye2024iclrw-pdeformer/)

BibTeX

@inproceedings{ye2024iclrw-pdeformer,
  title     = {{PDEformer: Towards a Foundation Model for One-Dimensional Partial Differential Equations}},
  author    = {Ye, Zhanhong and Huang, Xiang and Chen, Leheng and Liu, Hongsheng and Wang, Zidong and Dong, Bin},
  booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
  year      = {2024},
  url       = {https://mlanthology.org/iclrw/2024/ye2024iclrw-pdeformer/}
}