Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model

Abstract

We introduce a diffusion transformer architecture, AROMA (Attentive Reduced Order Model with Attention), for dynamics modeling of complex systems. By employing a discretization-free encoder and a local neural field decoder, we construct a latent space that accurately captures spatial information without requiring a traditional space discretization. The diffusion transformer models the dynamics in this latent space, conditioned on the previous state. It iteratively refines its predictions, providing enhanced stability compared to conventional transformers and thereby enabling longer rollouts. AROMA demonstrates superior performance over existing neural field methods in simulating 1D and 2D equations, highlighting the effectiveness of our approach in capturing complex dynamical behaviors.
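The abstract's core loop can be pictured as an autoregressive rollout in latent space, where each next state is produced by iterative refinement conditioned on the previous one. The sketch below is a minimal illustration of that idea only: the shapes, the function names, and the toy linear "denoiser" are assumptions for demonstration, not the authors' architecture.

```python
import numpy as np

# Illustrative latent shape: a fixed set of tokens, regardless of how many
# spatial points the encoder originally observed.
TOKENS, DIM, STEPS = 16, 8, 4  # latent tokens, token dim, refinement steps
rng = np.random.default_rng(0)
A = 0.9 * np.eye(DIM)          # toy latent dynamics (stand-in for the transformer)

def denoise(z_noisy, z_prev, tau):
    """One refinement step: blend the current noisy estimate toward a
    prediction conditioned on the previous latent state z_prev."""
    target = z_prev @ A        # conditioning on the previous state
    return (1.0 - tau) * z_noisy + tau * target

def predict_next(z_prev):
    """Predict z_{t+1} by iteratively refining from Gaussian noise,
    diffusion-style, conditioned on z_t."""
    z = rng.standard_normal((TOKENS, DIM))
    for k in range(1, STEPS + 1):
        z = denoise(z, z_prev, tau=k / STEPS)
    return z

def rollout(z0, horizon):
    """Autoregressive rollout entirely in latent space; a local neural
    field decoder (not sketched here) would map each latent back to a
    field at arbitrary query coordinates."""
    traj = [z0]
    for _ in range(horizon):
        traj.append(predict_next(traj[-1]))
    return traj

z0 = rng.standard_normal((TOKENS, DIM))
traj = rollout(z0, horizon=10)
```

The point of refining each prediction over several steps, rather than emitting it in one transformer pass, is that errors are corrected before they compound, which is what the abstract credits for the longer stable rollouts.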

Cite

Text

Serrano et al. "Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.

Markdown

[Serrano et al. "Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model." ICLR 2024 Workshops: AI4DiffEqtnsInSci, 2024.](https://mlanthology.org/iclrw/2024/serrano2024iclrw-latent/)

BibTeX

@inproceedings{serrano2024iclrw-latent,
  title     = {{Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model}},
  author    = {Serrano, Louis and Vittaut, Jean-Noël and Gallinari, Patrick},
  booktitle = {ICLR 2024 Workshops: AI4DiffEqtnsInSci},
  year      = {2024},
  url       = {https://mlanthology.org/iclrw/2024/serrano2024iclrw-latent/}
}