Learning and Transferring Physical Models Through Derivatives
Abstract
We propose Derivative Learning (DERL), a supervised approach that models physical systems by learning their partial derivatives. We also leverage DERL to build physical models incrementally, by designing a distillation protocol that effectively transfers knowledge from a pre-trained model to a student model. We provide theoretical guarantees that DERL can learn the true physical system and remain consistent with the underlying physical laws, even when trained on empirical derivatives. DERL outperforms state-of-the-art methods in generalizing an ODE to unseen initial conditions and a parametric PDE to unseen parameters. We also design a DERL-based method to transfer physical knowledge across models by extending them to new portions of the physical domain and to a new range of PDE parameters. This introduces a new pipeline for building physical models incrementally in multiple stages.
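The core idea of supervising a network through its derivatives can be illustrated with a short sketch. The following is a minimal, hypothetical example and not the paper's actual implementation: it trains a small network so that its autograd gradient matches empirical derivative targets (e.g., finite differences of observed trajectories). All names, architectures, and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch of derivative supervision (illustrative only; the paper's exact
# DERL loss, architecture, and data pipeline may differ).
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small network u_theta(x, t) approximating the physical field."""
    def __init__(self, in_dim=2, hidden=64, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

def derivative_loss(model, inputs, target_grads):
    """MSE between the model's autograd gradient w.r.t. its inputs
    and empirical partial-derivative targets."""
    inputs = inputs.clone().requires_grad_(True)
    u = model(inputs)
    grads = torch.autograd.grad(u.sum(), inputs, create_graph=True)[0]
    return nn.functional.mse_loss(grads, target_grads)

# Hypothetical usage: `inputs` are (x, t) samples, `target_grads` hold the
# empirical partial derivatives at those points (placeholders here).
model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
inputs = torch.rand(128, 2)
target_grads = torch.zeros(128, 2)
for _ in range(200):
    opt.zero_grad()
    loss = derivative_loss(model, inputs, target_grads)
    loss.backward()
    opt.step()
```

The same loss could, in principle, be reused for the distillation step by replacing the empirical targets with the derivatives of a pre-trained teacher model; this is a sketch of that idea, not the authors' protocol.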
Cite
Text
Trenta et al. "Learning and Transferring Physical Models Through Derivatives." Transactions on Machine Learning Research, 2026.
Markdown
[Trenta et al. "Learning and Transferring Physical Models Through Derivatives." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/trenta2026tmlr-learning/)
BibTeX
@article{trenta2026tmlr-learning,
  title   = {{Learning and Transferring Physical Models Through Derivatives}},
  author  = {Trenta, Alessandro and Cossu, Andrea and Bacciu, Davide},
  journal = {Transactions on Machine Learning Research},
  year    = {2026},
  url     = {https://mlanthology.org/tmlr/2026/trenta2026tmlr-learning/}
}