Multiple Physics Pretraining for Physical Surrogate Models
Abstract
We introduce multiple physics pretraining (MPP), an autoregressive task-agnostic pretraining approach for physical surrogate modeling. MPP involves training large surrogate models to predict the dynamics of multiple heterogeneous physical systems simultaneously by learning features that are broadly useful across diverse physical tasks. In order to learn effectively in this setting, we introduce a shared embedding and normalization strategy that projects the fields of multiple systems into a single shared embedding space. We validate the efficacy of our approach on both pretraining and downstream tasks. In pretraining, we show that a single MPP-pretrained model is able to match or outperform task-specific baselines on all training sub-tasks without the need for finetuning. For downstream tasks, we explore how the benefits of MPP scale with available finetuning data and demonstrate pretraining gains even across large physics gaps. We open-source our code and model weights trained at multiple scales for reproducibility and community experimentation.
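To make the shared embedding and normalization idea concrete, below is a minimal PyTorch sketch of one way fields from heterogeneous systems could be normalized per field and projected into a single shared embedding space. The class name, field names, and dimensions are illustrative assumptions for this page, not the released MPP implementation.

```python
import torch
import torch.nn as nn

class SharedFieldEmbedding(nn.Module):
    """Illustrative sketch: project heterogeneous physical fields into one
    shared embedding space with per-field normalization. Names and sizes
    are hypothetical, not taken from the released MPP code."""

    def __init__(self, field_names, embed_dim):
        super().__init__()
        # One learnable embedding direction per known field (e.g. density, pressure).
        self.field_index = {name: i for i, name in enumerate(field_names)}
        self.field_embeddings = nn.Parameter(torch.randn(len(field_names), embed_dim) * 0.02)

    def forward(self, fields):
        """fields: dict mapping field name -> tensor of shape (batch, H, W).
        Returns a tensor of shape (batch, H, W, embed_dim) in the shared space."""
        embedded = 0.0
        for name, values in fields.items():
            # Per-field normalization so systems with very different scales
            # (e.g. velocities vs. densities) land in a comparable range.
            mu = values.mean(dim=(-2, -1), keepdim=True)
            sigma = values.std(dim=(-2, -1), keepdim=True) + 1e-6
            normed = (values - mu) / sigma
            # Project the scalar field onto its learned embedding direction.
            embedded = embedded + normed.unsqueeze(-1) * self.field_embeddings[self.field_index[name]]
        return embedded


# Hypothetical usage: systems that expose only a subset of the fields still
# map into the same space, so one backbone can be pretrained on all of them.
embed = SharedFieldEmbedding(["density", "pressure", "vx", "vy"], embed_dim=96)
shallow_water = {"density": torch.randn(4, 64, 64), "vx": torch.randn(4, 64, 64)}
tokens = embed(shallow_water)  # shape (4, 64, 64, 96)
```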
Cite
Text
McCabe et al. "Multiple Physics Pretraining for Physical Surrogate Models." NeurIPS 2023 Workshops: AI4Science, 2023.
Markdown
[McCabe et al. "Multiple Physics Pretraining for Physical Surrogate Models." NeurIPS 2023 Workshops: AI4Science, 2023.](https://mlanthology.org/neuripsw/2023/mccabe2023neuripsw-multiple/)
BibTeX
@inproceedings{mccabe2023neuripsw-multiple,
title = {{Multiple Physics Pretraining for Physical Surrogate Models}},
author = {McCabe, Michael and Blancard, Bruno Régaldo-Saint and Parker, Liam Holden and Ohana, Ruben and Cranmer, Miles and Bietti, Alberto and Eickenberg, Michael and Golkar, Siavash and Krawezik, Geraud and Lanusse, Francois and Pettee, Mariel and Tesileanu, Tiberiu and Cho, Kyunghyun and Ho, Shirley},
booktitle = {NeurIPS 2023 Workshops: AI4Science},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/mccabe2023neuripsw-multiple/}
}