Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives
Abstract
The success of foundation models in natural language processing has sparked growing interest in developing analogous models for time series (TS) analysis. These time series foundation models (TSFMs), pre-trained on vast amounts of TS data, demonstrate zero-shot and few-shot inference capabilities on unseen datasets. However, the intrinsic heterogeneity of TS data presents unique challenges: accurate inference often necessitates a deep understanding of the underlying data-generating process and the sensing apparatus, which cannot be readily inferred from the raw data alone. Furthermore, recent advances in state-space models raise the question of whether they may offer advantages over transformer-based architectures for TS analysis. This paper investigates these questions in two key areas: (a) a fair comparison of methods for integrating metadata into TSFMs and (b) the comparative effectiveness of state-space models (SSMs) versus transformer models for TS forecasting. Our results, based on experiments across 11 datasets, suggest advantages for SSM building blocks as well as for incorporating the notion of real-world timestamps. More specifically, on our curated in-domain and out-of-domain datasets, an SSM approach incorporating timestamps outperforms three existing TSFMs on forecasting tasks while using 6,000$\times$ fewer trainable parameters and 10$\times$ less training data. The paper aims to highlight the potential of SSM building blocks and general directions for future TSFM research.
Cite
Text
Quan et al. "Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives." NeurIPS 2024 Workshops: TSALM, 2024.

Markdown
[Quan et al. "Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives." NeurIPS 2024 Workshops: TSALM, 2024.](https://mlanthology.org/neuripsw/2024/quan2024neuripsw-reimagining/)

BibTeX
@inproceedings{quan2024neuripsw-reimagining,
  title = {{Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives}},
  author = {Quan, Pengrui and Mulayim, Ozan Baris and Han, Liying and Hong, Dezhi and Berges, Mario and Srivastava, Mani},
  booktitle = {NeurIPS 2024 Workshops: TSALM},
  year = {2024},
  url = {https://mlanthology.org/neuripsw/2024/quan2024neuripsw-reimagining/}
}