Lag-Llama: Towards Foundation Models for Time Series Forecasting
Abstract
Aiming to build foundation models for time-series forecasting and study their scaling behavior, we present here our work-in-progress on Lag-Llama, a general-purpose univariate probabilistic time-series forecasting model trained on a large collection of time-series data. The model shows good zero-shot prediction capabilities on unseen "out-of-distribution" time-series datasets, outperforming supervised baselines. We use smoothly broken power-laws to fit and predict model scaling behavior. The open source code is made available at https://github.com/kashif/pytorch-transformer-ts.
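The abstract's use of smoothly broken power-laws to model scaling behavior can be sketched as follows. This is an illustrative one-break functional form (in the style of broken neural scaling laws) fitted with SciPy on synthetic "loss vs. scale" points; all parameter names and values are assumptions for illustration, not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def smoothly_broken_power_law(n, a, b, c0, c1, d, f):
    """One-break smoothly broken power law.

    Behaves like a + b * n**(-c0) well before the break scale d,
    and picks up an extra exponent c1 after it; f controls how
    sharp the transition between the two power-law regimes is.
    """
    return a + b * n ** (-c0) * (1.0 + (n / d) ** (1.0 / f)) ** (-c1 * f)

# Synthetic scaling data: loss measured at 25 scales with 1% noise
# (purely illustrative, not results from Lag-Llama).
rng = np.random.default_rng(0)
n = np.logspace(2, 6, 25)
true_loss = smoothly_broken_power_law(n, 0.1, 5.0, 0.3, 0.2, 1e4, 0.5)
y = true_loss * (1.0 + 0.01 * rng.standard_normal(n.size))

# Fit the form to the observed points, then it can be evaluated at
# larger n to extrapolate (predict) scaling behavior.
popt, _ = curve_fit(
    smoothly_broken_power_law, n, y,
    p0=[0.1, 5.0, 0.3, 0.2, 1e4, 0.5], maxfev=20000,
)
pred = smoothly_broken_power_law(n, *popt)
```

Once fitted, evaluating `smoothly_broken_power_law` at scales beyond the training range gives the predicted loss curve; the break scale `d` marks where the scaling exponent transitions.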
Cite
Text
Rasul et al. "Lag-Llama: Towards Foundation Models for Time Series Forecasting." NeurIPS 2023 Workshops: R0-FoMo, 2023.

Markdown

[Rasul et al. "Lag-Llama: Towards Foundation Models for Time Series Forecasting." NeurIPS 2023 Workshops: R0-FoMo, 2023.](https://mlanthology.org/neuripsw/2023/rasul2023neuripsw-lagllama/)

BibTeX
@inproceedings{rasul2023neuripsw-lagllama,
title = {{Lag-Llama: Towards Foundation Models for Time Series Forecasting}},
author = {Rasul, Kashif and Ashok, Arjun and Williams, Andrew Robert and Khorasani, Arian and Adamopoulos, George and Bhagwatkar, Rishika and Biloš, Marin and Ghonia, Hena and Hassen, Nadhir and Schneider, Anderson and Garg, Sahil and Drouin, Alexandre and Chapados, Nicolas and Nevmyvaka, Yuriy and Rish, Irina},
booktitle = {NeurIPS 2023 Workshops: R0-FoMo},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/rasul2023neuripsw-lagllama/}
}