Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning

Abstract

Recent years have witnessed the promise of coupling machine learning methods and physical domain-specific insights for solving scientific problems based on partial differential equations (PDEs). However, being data-intensive, these methods still require a large amount of PDE data. This reintroduces the need for expensive numerical PDE solutions, partially undermining the original goal of avoiding these expensive simulations. In this work, seeking data efficiency, we design unsupervised pretraining for PDE operator learning. To reduce the need for training data with heavy simulation costs, we mine unlabeled PDE data without simulated solutions, and we pretrain neural operators with physics-inspired reconstruction-based proxy tasks. To improve out-of-distribution performance, we further assist neural operators in flexibly leveraging a similarity-based method that learns in-context examples, without incurring extra training costs or designs. Extensive empirical evaluations on a diverse set of PDEs demonstrate that our method is highly data-efficient, more generalizable, and even outperforms conventional vision-pretrained models. We provide our code at https://github.com/delta-lab-ai/dataefficientnopt.
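
Below is a minimal, hedged sketch (not the authors' implementation; see the linked repository for that) of the two ideas named in the abstract: unsupervised pretraining of a neural operator with a reconstruction-based proxy task on unlabeled PDE inputs, and similarity-based retrieval of in-context examples at inference time. The operator here is a generic CNN stand-in, and names such as `ProxyOperator`, `mask_ratio`, and the 50/50 blending of model and neighbor predictions are illustrative assumptions.

```python
# Illustrative sketch only: masked-reconstruction pretraining on unlabeled PDE
# fields, plus a simple similarity-based in-context prediction step.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProxyOperator(nn.Module):
    """Toy stand-in for a neural operator acting on 2D PDE fields."""

    def __init__(self, channels: int = 1, width: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.GELU(),
            nn.Conv2d(width, width, 3, padding=1), nn.GELU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)


def pretrain_reconstruction(model, unlabeled_fields, mask_ratio=0.3, epochs=5, lr=1e-3):
    """Unsupervised pretraining: mask random entries of each input field and train
    the model to reconstruct the original field (no simulated PDE solutions needed)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x in unlabeled_fields:            # x: (batch, channels, H, W)
            mask = (torch.rand_like(x) > mask_ratio).float()
            recon = model(x * mask)           # reconstruct from the masked input
            loss = F.mse_loss(recon, x)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def in_context_predict(model, query, support_inputs, support_outputs, k=3):
    """Similarity-based in-context prediction: average the known solutions of the
    k support examples closest to the query (distance in flattened input space)
    and blend them with the model's own prediction."""
    d = torch.cdist(query.flatten(1), support_inputs.flatten(1))   # (1, n_support)
    idx = d.topk(k, largest=False).indices.squeeze(0)
    neighbor_avg = support_outputs[idx].mean(dim=0, keepdim=True)
    return 0.5 * model(query) + 0.5 * neighbor_avg


if __name__ == "__main__":
    model = ProxyOperator()
    unlabeled = [torch.randn(4, 1, 32, 32) for _ in range(8)]      # unlabeled PDE inputs
    pretrain_reconstruction(model, unlabeled)

    support_x = torch.randn(16, 1, 32, 32)                         # small labeled support set
    support_y = torch.randn(16, 1, 32, 32)
    query = torch.randn(1, 1, 32, 32)
    pred = in_context_predict(model, query, support_x, support_y)
    print(pred.shape)                                              # torch.Size([1, 1, 32, 32])
```

The retrieval step adds no training cost: it only requires nearest-neighbor lookups over a labeled support set at inference time, which is the spirit of the "without incurring extra training costs or designs" claim in the abstract.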

Cite

Text

Chen et al. "Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning." Neural Information Processing Systems, 2024. doi:10.52202/079017-0201

Markdown

[Chen et al. "Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/chen2024neurips-dataefficient/) doi:10.52202/079017-0201

BibTeX

@inproceedings{chen2024neurips-dataefficient,
  title     = {{Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning}},
  author    = {Chen, Wuyang and Song, Jialin and Ren, Pu and Subramanian, Shashank and Morozov, Dmitriy and Mahoney, Michael W.},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0201},
  url       = {https://mlanthology.org/neurips/2024/chen2024neurips-dataefficient/}
}