Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models

Abstract

Physics-Informed Machine Learning has emerged as a popular approach for modeling and simulation in digital twins, enabling the generation of accurate models of processes and behaviors in real-world systems. However, existing methods either rely on simple loss regularizations that offer limited physics integration or employ highly specialized architectures that are difficult to generalize across diverse physical systems. This paper presents a generic approach based on a novel physics-encoded residual neural network (PERNN) architecture that seamlessly combines data-driven and physics-based analytical models to overcome these limitations. Our method integrates differentiable physics blocks (implementing mathematical operators from physics-based models) with feed-forward learning blocks, while intermediate residual blocks ensure stable gradient flow during training. Consequently, the model naturally adheres to the underlying physical principles even when prior physics knowledge is incomplete, thereby improving generalizability with low data requirements and reduced model complexity. We investigate our approach in two application domains. The first is a steering model for autonomous vehicles in a simulation environment, and the second is a digital twin for climate modeling using an ordinary differential equation (ODE)-based model of Net Ecosystem Exchange (NEE) to enable gap-filling in flux tower data. In both cases, our method outperforms conventional neural network approaches as well as state-of-the-art Physics-Informed Machine Learning methods.
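
The abstract describes an architecture that interleaves differentiable physics blocks with feed-forward learning blocks through residual connections. As a rough illustration only, and not the authors' implementation, the PyTorch sketch below shows one way such a physics-encoded residual cell could be assembled; the class names (PhysicsBlock, PERNNCell), the toy linear-decay ODE step, and the layer sizes are all assumptions made for this example.

# Minimal, hypothetical sketch of a physics-encoded residual cell.
# Class names, the toy ODE, and sizes are illustrative assumptions,
# not the paper's implementation.
import torch
import torch.nn as nn

class PhysicsBlock(nn.Module):
    """Differentiable physics operator: one explicit Euler step of a known
    ODE component dx/dt = f(x). Here f is a placeholder linear decay."""
    def __init__(self, decay: float = 0.1, dt: float = 0.01):
        super().__init__()
        self.decay, self.dt = decay, dt

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x_{t+1} = x_t + dt * f(x_t)
        return x + self.dt * (-self.decay * x)

class PERNNCell(nn.Module):
    """Physics-encoded residual cell: a feed-forward learning block corrects
    the physics prediction; the skip connection keeps gradient flow stable."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.physics = PhysicsBlock()
        self.learned = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x_phys = self.physics(x)              # physics-based update
        return x_phys + self.learned(x_phys)  # residual data-driven correction

# Usage: stack a few cells and run a batch of 8 four-dimensional states.
model = nn.Sequential(*[PERNNCell(dim=4) for _ in range(3)])
x = torch.randn(8, 4)
print(model(x).shape)  # torch.Size([8, 4])

Because the learned block only adds a correction on top of the physics output, the stack reduces to the analytical model when the correction is zero, which is the intuition behind the low data requirements claimed in the abstract.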

Cite

Text

Zia et al. "Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models." Machine Learning, 2025. doi:10.1007/s10994-025-06808-y

Markdown

[Zia et al. "Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/zia2025mlj-physics/) doi:10.1007/s10994-025-06808-y

BibTeX

@article{zia2025mlj-physics,
  title     = {{Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models}},
  author    = {Zia, Muhammad Saad and Houpert, Corentin and Anjum, Ashiq and Liu, Lu and Conway, Anthony and Peña-Ríos, Anasol},
  journal   = {Machine Learning},
  year      = {2025},
  pages     = {180},
  doi       = {10.1007/s10994-025-06808-y},
  volume    = {114},
  url       = {https://mlanthology.org/mlj/2025/zia2025mlj-physics/}
}