Hybrid Latent Representations for PDE Emulation
Abstract
For classical PDE solvers, adjusting the spatial resolution and time step offers a trade-off between speed and accuracy. Neural emulators often achieve better speed-accuracy trade-offs by operating on a compact representation of the PDE system. Coarsened PDE fields are a simple and effective representation, but cannot exploit fine spatial scales in the high-fidelity numerical solutions. Alternatively, unstructured latent representations provide efficient autoregressive rollouts, but cannot enforce local interactions or physical laws as inductive biases. To overcome these limitations, we introduce hybrid representations that augment coarsened PDE fields with spatially structured latent variables extracted from high-resolution inputs. Hybrid representations provide efficient rollouts, can be trained on a simple loss defined on coarsened PDE fields, and support hard physical constraints. When predicting fine- and coarse-scale features across multiple PDE emulation tasks, they outperform or match the speed-accuracy trade-offs of the best convolutional, attentional, Fourier operator-based and autoencoding baselines.
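The core idea of a hybrid representation can be illustrated with a minimal sketch: block-average a high-resolution PDE field onto a coarse grid, then stack it with spatially structured latent channels on the same grid. The encoder producing the latents, the channel count `k`, and the coarsening factor are all hypothetical choices for illustration, not details taken from the paper.

```python
import numpy as np

def coarsen(field, factor):
    """Block-average a fine-resolution 2D field down by `factor`."""
    h, w = field.shape
    return field.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Fine-resolution PDE snapshot (e.g. 128x128) coarsened to 32x32.
fine = np.random.rand(128, 128)
coarse = coarsen(fine, 4)

# Placeholder for an encoder's output: k latent channels living on the
# same coarse grid, extracted from the high-resolution input.
k = 8
latents = np.random.rand(k, 32, 32)

# Hybrid state: coarsened physical field stacked with structured latents.
# An emulator would roll this state forward autoregressively, with the
# training loss defined only on the coarsened-field channel.
hybrid = np.concatenate([coarse[None], latents], axis=0)
print(hybrid.shape)  # (9, 32, 32)
```

Because the latents share the coarse grid's spatial layout, local operators (e.g. convolutions) and per-cell physical constraints can act on the hybrid state directly, which unstructured latent vectors do not allow.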
Cite

Text
Bekar et al. "Hybrid Latent Representations for PDE Emulation." Advances in Neural Information Processing Systems, 2025.

Markdown
[Bekar et al. "Hybrid Latent Representations for PDE Emulation." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/bekar2025neurips-hybrid/)

BibTeX
@inproceedings{bekar2025neurips-hybrid,
  title = {{Hybrid Latent Representations for PDE Emulation}},
  author = {Bekar, Ali Can and Agarwal, Siddhant and Hüttig, Christian and Tosi, Nicola and Greenberg, David S.},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/bekar2025neurips-hybrid/}
}