Unified Training of Universal Time Series Forecasting Transformers
Abstract
Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models. The concept of universal forecasting, emerging from pre-training on a vast collection of time series datasets, envisions a single Large Time Series Model capable of addressing diverse downstream forecasting tasks. However, constructing such a model poses unique challenges specific to time series data: (i) cross-frequency learning, (ii) accommodating an arbitrary number of variates for multivariate time series, and (iii) addressing the varying distributional properties inherent in large-scale data. To address these challenges, we present novel enhancements to the conventional time series Transformer architecture, resulting in our proposed Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai). Trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains, Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models. Code, data, and model weights can be found at https://github.com/SalesforceAIResearch/uni2ts.
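To make the masked-encoder formulation concrete, here is a minimal illustrative sketch of the general idea: context patches are embedded as tokens, the forecast horizon is represented by learnable mask tokens, and a Transformer encoder predicts the masked patches. This is not the released uni2ts implementation; the class name, hyperparameters, and the omission of positional encodings, any-variate attention, multi-patch-size projections, and the mixture-distribution output head are all simplifying assumptions for illustration.

```python
import torch
import torch.nn as nn

class MaskedEncoderForecaster(nn.Module):
    """Toy masked-encoder forecaster (illustrative only, not Moirai itself):
    embed context patches, append learnable mask tokens for the horizon,
    and decode the masked positions with a Transformer encoder."""

    def __init__(self, patch_size=16, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.patch_size = patch_size
        self.embed = nn.Linear(patch_size, d_model)        # patch -> token embedding
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_size)          # token -> forecast patch
        # NOTE: a real model also needs positional information and a
        # probabilistic output head; both are omitted in this sketch.

    def forward(self, context, horizon_patches):
        # context: (batch, context_len), context_len divisible by patch_size
        b = context.shape[0]
        patches = context.unfold(-1, self.patch_size, self.patch_size)  # (b, n_ctx, patch)
        tokens = self.embed(patches)                                    # (b, n_ctx, d)
        masks = self.mask_token.expand(b, horizon_patches, -1)          # (b, n_hor, d)
        hidden = self.encoder(torch.cat([tokens, masks], dim=1))
        forecast = self.head(hidden[:, -horizon_patches:])              # (b, n_hor, patch)
        return forecast.reshape(b, -1)                                  # (b, horizon)

# Usage: forecast 32 steps (2 patches of 16) from a 128-step context.
model = MaskedEncoderForecaster()
context = torch.randn(4, 128)
print(model(context, horizon_patches=2).shape)  # torch.Size([4, 32])
```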
Cite
Text
Woo et al. "Unified Training of Universal Time Series Forecasting Transformers." International Conference on Machine Learning, 2024.

Markdown

[Woo et al. "Unified Training of Universal Time Series Forecasting Transformers." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/woo2024icml-unified/)

BibTeX
@inproceedings{woo2024icml-unified,
  title     = {{Unified Training of Universal Time Series Forecasting Transformers}},
  author    = {Woo, Gerald and Liu, Chenghao and Kumar, Akshat and Xiong, Caiming and Savarese, Silvio and Sahoo, Doyen},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {53140--53164},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/woo2024icml-unified/}
}