LAST SToP for Modeling Asynchronous Time Series
Abstract
We present a novel prompt design for Large Language Models (LLMs) tailored to asynchronous time series. Unlike regular time series, which assume values at evenly spaced time points, asynchronous time series consist of timestamped events occurring at irregular intervals, each described in natural language. Our approach effectively utilizes the rich natural language of event descriptions, allowing LLMs to benefit from their broad world knowledge when reasoning across different domains and tasks. This lets us extend the scope of asynchronous time series analysis beyond forecasting to tasks such as anomaly detection and data imputation. We further introduce Stochastic Soft Prompting, a novel prompt-tuning mechanism that significantly improves model performance, outperforming existing fine-tuning methods such as QLoRA. Through extensive experiments on real-world datasets, we demonstrate that our approach achieves state-of-the-art performance across different tasks and datasets.
Cite
Text
Gupta et al. "LAST SToP for Modeling Asynchronous Time Series." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Gupta et al. "LAST SToP for Modeling Asynchronous Time Series." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/gupta2025icml-last/)
BibTeX
@inproceedings{gupta2025icml-last,
  title     = {{LAST SToP for Modeling Asynchronous Time Series}},
  author    = {Gupta, Shubham and Durand, Thibaut and Taylor, Graham W. and Bialokozowicz, Lilian},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {21297--21321},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/gupta2025icml-last/}
}