In-Context Time Series Predictor

Abstract

Recent Transformer-based large language models (LLMs) demonstrate an in-context learning ability: they can perform various functions based solely on the provided context, without updating model parameters. To fully utilize these in-context capabilities for time series forecasting (TSF) problems, and unlike previous Transformer-based or LLM-based TSF methods, we reformulate "time series forecasting tasks" as input tokens by constructing a series of (lookback, future) pairs within the tokens. This formulation aligns more closely with the inherent in-context mechanism and is more parameter-efficient, without the need for pre-trained LLM parameters. Furthermore, it addresses issues such as overfitting in existing Transformer-based TSF models, consistently achieving better performance across full-data, few-shot, and zero-shot settings compared to previous architectures.
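As a concrete illustration of the token construction described in the abstract, the sketch below shows one plausible way to turn a univariate series into a sequence of (lookback, future) tokens plus a final lookback-only query token whose future part the model is asked to predict. The function name build_in_context_tokens, the zero-padding of the query token's future slots, and the flattened token layout are illustrative assumptions, not the authors' implementation.

import numpy as np

def build_in_context_tokens(series, lookback, horizon, stride=1):
    """Form (lookback, future) pairs from a 1-D series.

    Each pair is flattened into one token of length lookback + horizon.
    The final token keeps only its lookback window (future slots are
    zero-padded), representing the forecasting query.
    """
    series = np.asarray(series, dtype=np.float32)
    tokens = []
    # Context tokens: windows whose future values are fully observed.
    last_start = len(series) - lookback - horizon
    for start in range(0, last_start + 1, stride):
        lb = series[start : start + lookback]
        fut = series[start + lookback : start + lookback + horizon]
        tokens.append(np.concatenate([lb, fut]))
    # Query token: the most recent lookback window with an empty future part.
    query_lb = series[-lookback:]
    tokens.append(np.concatenate([query_lb, np.zeros(horizon, dtype=np.float32)]))
    return np.stack(tokens)  # shape: (num_tokens, lookback + horizon)

if __name__ == "__main__":
    t = np.arange(48, dtype=np.float32)
    toks = build_in_context_tokens(t, lookback=8, horizon=4)
    print(toks.shape)  # (38, 12): 37 fully observed (lookback, future) pairs + 1 query token

In this sketch, the Transformer would attend over the context tokens to infer the forecasting "task" in-context and fill in the future portion of the query token; how the model consumes these tokens follows the paper, while the padding and stride choices here are only for illustration.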

Cite

Text

Lu et al. "In-Context Time Series Predictor." International Conference on Learning Representations, 2025.

Markdown

[Lu et al. "In-Context Time Series Predictor." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/lu2025iclr-incontext/)

BibTeX

@inproceedings{lu2025iclr-incontext,
  title     = {{In-Context Time Series Predictor}},
  author    = {Lu, Jiecheng and Sun, Yan and Yang, Shihao},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/lu2025iclr-incontext/}
}