Vectorized Conditional Neural Fields: A Framework for Solving Time-Dependent PDEs
Abstract
Transformer models are increasingly used for solving Partial Differential Equations (PDEs). However, they lack at least one of several desirable properties of an ideal surrogate model, such as (i) generalization to PDE parameters not seen during training, (ii) spatial and temporal zero-shot super-resolution, (iii) continuous temporal extrapolation, (iv) applicability to PDEs of different dimensionalities, and (v) efficient inference for longer temporal rollouts. To address these limitations, we propose Vectorized Conditional Neural Fields (VCNeFs), which represent the solution of time-dependent PDEs as neural fields. In contrast to prior methods, VCNeFs compute the solutions for a set of multiple spatio-temporal query points in parallel while modeling their dependencies through attention mechanisms. Moreover, VCNeFs can condition the neural field on both the initial conditions and the parameters of the PDEs. An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
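To make the core idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of a conditional neural field that answers many spatio-temporal queries in one parallel pass while coupling them through a single attention layer. All weights are random stand-ins for learned parameters, and the embedding sizes, the helper `vcnef_sketch`, and the scalar `pde_param` conditioning are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # embedding width (illustrative choice)

# Random projections standing in for learned weights.
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
W_embed = rng.normal(size=(3, d))   # embeds (x, t, pde_param) per query
W_cond = rng.normal(size=(8, d))    # embeds a discretized initial condition
W_out = rng.normal(size=(d, 1))     # maps attended features to a solution value

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def vcnef_sketch(queries, init_cond, pde_param):
    """Evaluate the field at many (x, t) query points in parallel.

    queries: (n, 2) array of spatio-temporal points; init_cond: (8,) array.
    """
    n = queries.shape[0]
    # Attach the PDE parameter to every query, then embed.
    feats = np.concatenate([queries, np.full((n, 1), pde_param)], axis=1)
    h = feats @ W_embed + init_cond @ W_cond      # condition on u0 and PDE params
    q, k, v = h @ W_q, h @ W_k, h @ W_v
    attn = softmax(q @ k.T / np.sqrt(d))          # dependencies among query points
    return (attn @ v) @ W_out                     # one solution value per query

u0 = rng.normal(size=8)
pts = rng.uniform(size=(5, 2))                    # five (x, t) queries, one pass
u = vcnef_sketch(pts, u0, pde_param=0.1)
print(u.shape)
```

Because the field is queried at continuous `(x, t)` coordinates rather than on a fixed grid, the same trained model could in principle be evaluated at finer resolutions or later times, which is the mechanism behind the zero-shot super-resolution and temporal extrapolation properties mentioned in the abstract.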
Cite
Text
Hagnberger et al. "Vectorized Conditional Neural Fields: A Framework for Solving Time-Dependent PDEs." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.
Markdown
[Hagnberger et al. "Vectorized Conditional Neural Fields: A Framework for Solving Time-Dependent PDEs." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.](https://mlanthology.org/iclrw/2024/hagnberger2024iclrw-vectorized/)
BibTeX
@inproceedings{hagnberger2024iclrw-vectorized,
title = {{Vectorized Conditional Neural Fields: A Framework for Solving Time-Dependent PDEs}},
author = {Hagnberger, Jan and Kalimuthu, Marimuthu and Niepert, Mathias},
booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
year = {2024},
url = {https://mlanthology.org/iclrw/2024/hagnberger2024iclrw-vectorized/}
}