ENMA: Tokenwise Autoregression for Continuous Neural PDE Operators
Abstract
Solving time-dependent parametric partial differential equations (PDEs) remains a fundamental challenge for neural solvers, particularly when generalizing across a wide range of physical parameters and dynamics. When data is uncertain or incomplete, as is often the case, a natural approach is to turn to generative models. We introduce ENMA, a generative neural operator designed to model spatio-temporal dynamics arising from physical phenomena. ENMA predicts future dynamics in a compressed latent space using a generative masked autoregressive transformer trained with a flow matching loss, enabling tokenwise generation. Irregularly sampled spatial observations are encoded into uniform latent representations via attention mechanisms and further compressed through a spatio-temporal convolutional encoder. This allows ENMA to perform in-context learning at inference time by conditioning on either past states of the target trajectory or auxiliary context trajectories with similar dynamics. The result is a robust and adaptable framework that generalizes to new PDE regimes and supports one-shot surrogate modeling of time-dependent parametric PDEs.
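The abstract mentions training the autoregressive transformer with a flow matching loss over continuous latent tokens. As a rough illustration of what such an objective looks like (this is a generic conditional flow matching sketch, not the authors' implementation; all shapes and names are assumptions), one interpolates between a noise sample and a target token and regresses the model's predicted velocity onto the straight-line velocity:

```python
# Generic conditional flow-matching loss sketch (illustrative only,
# not ENMA's actual code). Latent tokens are treated as flat vectors.
import numpy as np

def flow_matching_loss(v_pred_fn, x0, x1, t):
    """Mean-squared flow-matching loss for a batch of latent tokens.

    v_pred_fn : callable (xt, t) -> predicted velocity, shape (batch, dim)
    x0        : (batch, dim) noise samples
    x1        : (batch, dim) target latent tokens
    t         : (batch, 1) interpolation times in [0, 1]
    """
    xt = (1.0 - t) * x0 + t * x1   # linear interpolant between noise and data
    v_target = x1 - x0             # straight-line (rectified) velocity target
    v_pred = v_pred_fn(xt, t)
    return float(np.mean((v_pred - v_target) ** 2))

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 4))
x1 = rng.standard_normal((8, 4))
t = rng.uniform(size=(8, 1))

# An oracle that returns the exact target velocity drives the loss to zero;
# in practice a transformer predicts the velocity from (xt, t) and context.
oracle_loss = flow_matching_loss(lambda xt, t: x1 - x0, x0, x1, t)
zero_loss = flow_matching_loss(lambda xt, t: np.zeros_like(xt), x0, x1, t)
```

In a tokenwise autoregressive setup like the one described, this loss would be applied per generated latent token, with the velocity network conditioned on previously generated tokens and on the encoded context trajectory.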
Cite
Text
Koupaï et al. "ENMA: Tokenwise Autoregression for Continuous Neural PDE Operators." Advances in Neural Information Processing Systems, 2025.
Markdown
[Koupaï et al. "ENMA: Tokenwise Autoregression for Continuous Neural PDE Operators." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/koupai2025neurips-enma/)
BibTeX
@inproceedings{koupai2025neurips-enma,
  title = {{ENMA: Tokenwise Autoregression for Continuous Neural PDE Operators}},
  author = {Koupaï, Armand Kassaï and Le Boudec, Lise and Serrano, Louis and Gallinari, Patrick},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/koupai2025neurips-enma/}
}