Zebra: A Continuous Generative Transformer for Solving Parametric PDEs

Abstract

Foundation models have revolutionized deep learning, moving beyond task-specific architectures to versatile models pre-trained with self-supervised learning on extensive datasets. These models have set new benchmarks across domains, including natural language processing, computer vision, and biology, owing to their adaptability and state-of-the-art performance on downstream tasks. Yet, for solving PDEs or modeling physical dynamics, the potential of foundation models remains untapped because existing datasets are limited in scale. This study presents Zebra, a novel generative model that adapts language-model techniques to the continuous domain of PDE solutions. Pre-trained on specific PDE families, Zebra excels at dynamics forecasting, surpassing existing neural operators and solvers, and it establishes a promising path toward foundation models pre-trained extensively on varied PDE scenarios to tackle PDE problems with scarce data.
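The abstract does not describe the architecture in detail. As a rough illustration of what "adapting language-model techniques to the continuous domain of PDE solutions" can look like, the sketch below trains a causal transformer to predict the next discretized PDE state from the preceding states, in the style of next-token prediction. All class names, dimensions, and hyperparameters here are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not the authors' code): an autoregressive transformer that
# treats a discretized PDE trajectory as a sequence of continuous "tokens" and is
# trained, language-model style, to predict the next state from previous ones.
import torch
import torch.nn as nn

class ContinuousARTransformer(nn.Module):
    def __init__(self, state_dim=64, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(state_dim, d_model)       # continuous state -> token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, state_dim)        # token embedding -> next-state prediction

    def forward(self, states):                           # states: (batch, time, state_dim)
        T = states.size(1)
        mask = nn.Transformer.generate_square_subsequent_mask(T).to(states.device)
        h = self.backbone(self.embed(states), mask=mask) # causal self-attention over time
        return self.head(h)                              # one prediction per input step

# Training step: next-state prediction on trajectories drawn from one PDE family.
model = ContinuousARTransformer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
traj = torch.randn(8, 16, 64)                            # placeholder trajectories (batch, time, grid points)
pred = model(traj[:, :-1])                               # predict states 1..T-1 from states 0..T-2
loss = nn.functional.mse_loss(pred, traj[:, 1:])
loss.backward(); opt.step()

At inference time, such a model would be rolled out autoregressively: the predicted state is appended to the context and fed back in to forecast the following step.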

Cite

Text

Serrano et al. "Zebra: A Continuous Generative Transformer for Solving Parametric PDEs." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.

Markdown

[Serrano et al. "Zebra: A Continuous Generative Transformer for Solving Parametric PDEs." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.](https://mlanthology.org/iclrw/2024/serrano2024iclrw-zebra/)

BibTeX

@inproceedings{serrano2024iclrw-zebra,
  title     = {{Zebra: A Continuous Generative Transformer for Solving Parametric PDEs}},
  author    = {Serrano, Louis and Erbacher, Pierre and Vittaut, Jean-Noël and Gallinari, Patrick},
  booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
  year      = {2024},
  url       = {https://mlanthology.org/iclrw/2024/serrano2024iclrw-zebra/}
}