Generalized Prompt Tuning: How to Use a Frozen Pre-Trained Univariate Time Series Foundation Model for Multivariate Time Series Prediction
Abstract
Time series foundation models are pre-trained on large datasets and are able to achieve state-of-the-art performance in diverse tasks. However, we observe that most current time series foundation models are either univariate or assume channel independence, meaning that they handle multivariate time series but do not model how the different variables relate. In this paper, we propose a prompt-tuning-inspired fine-tuning technique, Generalized Prompt Tuning (Gen-P-Tuning), that adapts an existing univariate time series foundation model (treated as frozen) to multivariate time series prediction. Our approach provides a way to combine information across channels (variables) of multivariate time series. We demonstrate the effectiveness of our fine-tuning approach against various baselines on 8 classification and 4 forecasting datasets. Our code is available at: https://github.com/Ilovecodingforever/Gen-P-Tuning
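As a rough illustration of the idea only (not the authors' exact architecture; see the linked repository for the real implementation), the sketch below shows one way such a scheme could look in PyTorch: a small trainable module pools information across channels into prompt tokens, which are prepended to each channel's embedded sequence before it passes through the frozen univariate backbone. All class and parameter names here (ChannelPromptGenerator, GenPTuningWrapper, num_prompts, and the mean-pooling choice) are hypothetical assumptions for this sketch.

import torch
import torch.nn as nn

class ChannelPromptGenerator(nn.Module):
    """Hypothetical sketch: build prompt tokens that mix information
    across the channels of a multivariate series."""
    def __init__(self, d_model: int, num_prompts: int):
        super().__init__()
        self.num_prompts = num_prompts
        self.proj = nn.Linear(d_model, num_prompts * d_model)

    def forward(self, channel_emb: torch.Tensor) -> torch.Tensor:
        # channel_emb: (batch, channels, seq_len, d_model)
        B, C, L, D = channel_emb.shape
        # Summarize each channel over time, then mix across channels by mean-pooling.
        summary = channel_emb.mean(dim=2).mean(dim=1)            # (B, D)
        prompts = self.proj(summary).view(B, self.num_prompts, D)
        # The same cross-channel prompts are shared by every channel.
        return prompts.unsqueeze(1).expand(B, C, self.num_prompts, D)

class GenPTuningWrapper(nn.Module):
    """Wraps a frozen univariate encoder; only the prompt generator
    (and a task head, omitted here) would be trained."""
    def __init__(self, frozen_encoder: nn.Module, d_model: int, num_prompts: int = 4):
        super().__init__()
        self.encoder = frozen_encoder
        for p in self.encoder.parameters():
            p.requires_grad = False                              # keep the backbone frozen
        self.prompt_gen = ChannelPromptGenerator(d_model, num_prompts)

    def forward(self, channel_emb: torch.Tensor) -> torch.Tensor:
        B, C, L, D = channel_emb.shape
        prompts = self.prompt_gen(channel_emb)                   # (B, C, P, D)
        x = torch.cat([prompts, channel_emb], dim=2)             # prepend prompts per channel
        # Fold channels into the batch so the univariate encoder sees one series at a time.
        out = self.encoder(x.reshape(B * C, L + self.prompt_gen.num_prompts, D))
        return out.reshape(B, C, -1, D)

Here, frozen_encoder is assumed to be any pre-trained module mapping (batch, tokens, d_model) to (batch, tokens, d_model), e.g. a transformer encoder stack; during fine-tuning only prompt_gen and a downstream forecasting or classification head would receive gradients.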
Cite
Text
Liu et al. "Generalized Prompt Tuning: How to Use a Frozen Pre-Trained Univariate Time Series Foundation Model for Multivariate Time Series Prediction." NeurIPS 2024 Workshops: TSALM, 2024.
BibTeX
@inproceedings{liu2024neuripsw-generalized,
title = {{Generalized Prompt Tuning: How to Use a Frozen Pre-Trained Univariate Time Series Foundation Model for Multivariate Time Series Prediction}},
author = {Liu, Mingzhu and Chen, Angela and Chen, George H.},
booktitle = {NeurIPS 2024 Workshops: TSALM},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/liu2024neuripsw-generalized/}
}