Multiscale Neural Operator: Learning Fast and Grid-Independent PDE Solvers
Abstract
Numerical simulations in climate, chemistry, or astrophysics are computationally too expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order or surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate, and pure machine learning (ML)-based surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics for simulating large-scale dynamics and limits learning to the hard-to-model term, called the parametrization or closure, which captures the effect of fine-scale onto large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our $\textit{multiscale neural operator}$ is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equations.
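The multiscale Lorenz96 benchmark named in the abstract couples slow variables $X_k$ to fast variables $Y_{j,k}$; the coupling (sum) term in the slow equation is exactly what a learned parametrization replaces. Below is a minimal NumPy sketch of the standard two-scale tendencies (Lorenz, 1996), with illustrative parameter values; it is not the authors' code, and their experimental settings may differ.

```python
import numpy as np

def lorenz96_two_scale_tendencies(X, Y, F=10.0, h=1.0, b=10.0, c=10.0):
    """Tendencies of the two-scale Lorenz96 system.

    X: (K,) slow variables; Y: (J, K) fast variables coupled to each X_k.
    F, h, b, c are illustrative defaults, not values from the paper.
    """
    K, J = X.shape[0], Y.shape[0]
    # Slow dynamics: advection, damping, forcing, and the coupling term
    # (the quantity a parametrization/closure would model) per slow cell.
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Y.sum(axis=0))
    # Fast dynamics on the flattened fast lattice
    # (column-major: all J fast variables of cell 0, then cell 1, ...).
    Yf = Y.flatten(order='F')
    dYf = (c * b * np.roll(Yf, -1) * (np.roll(Yf, 1) - np.roll(Yf, -2))
           - c * Yf + (h * c / b) * np.repeat(X, J))
    dY = dYf.reshape(J, K, order='F')
    return dX, dY
```

With uniform slow state $X_k = F$ and $Y = 0$, the advection and coupling terms vanish, so $dX = 0$, which is a quick sanity check of the signs.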
Cite
Text
Lütjens et al. "Multiscale Neural Operator: Learning Fast and Grid-Independent PDE Solvers." ICML 2022 Workshops: AI4Science, 2022.

Markdown

[Lütjens et al. "Multiscale Neural Operator: Learning Fast and Grid-Independent PDE Solvers." ICML 2022 Workshops: AI4Science, 2022.](https://mlanthology.org/icmlw/2022/lutjens2022icmlw-multiscale/)

BibTeX
@inproceedings{lutjens2022icmlw-multiscale,
title = {{Multiscale Neural Operator: Learning Fast and Grid-Independent PDE Solvers}},
author = {Lütjens, Björn and Crawford, Catherine H. and Watson, Campbell D. and Hill, Christopher and Newman, Dava},
booktitle = {ICML 2022 Workshops: AI4Science},
year = {2022},
url = {https://mlanthology.org/icmlw/2022/lutjens2022icmlw-multiscale/}
}