Continuously Updating Digital Twins Using Large Language Models
Abstract
Digital twins are models of real-world systems that can simulate their dynamics in response to potential actions. In complex settings, the state and action variables of a system, and the data and knowledge available about it, can constantly change, requiring digital twins to continuously update with these changes to remain relevant. Current approaches struggle in this regard: they require fixed, well-defined modelling environments, and they can neither adapt to novel variables without re-designs nor incorporate new information without re-training. To address this, we frame digital twinning as an in-context learning problem using large language models, enabling seamless updates to the twin at inference time. We develop CALM-DT, a Context-Adaptive Language Model-based Digital Twin that can accurately simulate across diverse state-action spaces using in-context learning alone, by utilising fine-tuned encoders for sample retrieval. We empirically demonstrate CALM-DT's competitive performance with existing digital twin approaches, and its unique ability to adapt to changes in its modelling environment without parameter updates.
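The abstract describes a retrieval-augmented in-context learning loop: a fine-tuned encoder embeds the current state-action pair, similar transitions are retrieved from stored trajectories, and a language model predicts the next state from those examples. The sketch below illustrates this idea only; the encoder, language model, prompt format, and all function names are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's implementation) of retrieval-augmented
# in-context simulation: embed the query (state, action), retrieve the most
# similar stored transitions, and prompt an LLM to predict the next state.
# `encoder` and `llm` are placeholder callables.
import json
import numpy as np

def embed(encoder, state, action):
    """Embed a (state, action) pair with a (hypothetical) fine-tuned encoder."""
    return np.asarray(encoder(json.dumps({"state": state, "action": action})))

def retrieve(buffer, query_vec, k=5):
    """Return the k stored transitions whose embeddings are closest to the query."""
    sims = [float(np.dot(query_vec, emb) /
                  (np.linalg.norm(query_vec) * np.linalg.norm(emb)))
            for emb, _ in buffer]
    top = np.argsort(sims)[-k:][::-1]
    return [buffer[i][1] for i in top]

def simulate_step(llm, encoder, buffer, state, action):
    """Predict the next state via in-context learning over retrieved transitions."""
    examples = retrieve(buffer, embed(encoder, state, action))
    prompt = "Predict the next state.\n"
    for ex in examples:
        prompt += (f"state: {json.dumps(ex['state'])} "
                   f"action: {json.dumps(ex['action'])} "
                   f"next: {json.dumps(ex['next_state'])}\n")
    prompt += f"state: {json.dumps(state)} action: {json.dumps(action)} next:"
    return json.loads(llm(prompt))  # LLM is assumed to return the next state as JSON
```

Under this framing, updating the twin amounts to appending new transitions (possibly with novel variables in the JSON records) to the retrieval buffer, so no parameter updates are needed at simulation time.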
Cite
Text
Amad et al. "Continuously Updating Digital Twins Using Large Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Amad et al. "Continuously Updating Digital Twins Using Large Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/amad2025icml-continuously/)
BibTeX
@inproceedings{amad2025icml-continuously,
  title     = {{Continuously Updating Digital Twins Using Large Language Models}},
  author    = {Amad, Harry and Astorga, Nicolás and van der Schaar, Mihaela},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {1343--1366},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/amad2025icml-continuously/}
}