UP2ME: Univariate Pre-Training to Multivariate Fine-Tuning as a General-Purpose Framework for Multivariate Time Series Analysis
Abstract
Despite the success of self-supervised pre-training on text and images, applying it to multivariate time series (MTS) still lags behind methods tailored for tasks such as forecasting, imputation, and anomaly detection. We propose a general-purpose framework named UP2ME (Univariate Pre-training to Multivariate Fine-tuning). It conducts task-agnostic pre-training when downstream tasks are unspecified; once the task and its setting (e.g., forecasting length) are determined, it produces sensible solutions with frozen pre-trained parameters, which has not been achieved before, and is further refined by fine-tuning. A univariate-to-multivariate paradigm is devised to address the heterogeneity of temporal and cross-channel dependencies. In univariate pre-training, univariate instances of diverse lengths are generated for Masked AutoEncoder (MAE) pre-training, discarding cross-channel dependencies. The pre-trained model handles downstream tasks by formulating each as a specific mask-reconstruction problem. In multivariate fine-tuning, a dependency graph among channels is constructed with the pre-trained encoder to better capture cross-channel dependencies. Experiments on eight real-world datasets show state-of-the-art performance in forecasting and imputation, and performance approaching that of task-specific methods in anomaly detection. Our code is available at https://github.com/Thinklab-SJTU/UP2ME.
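To make the frozen-parameter ("immediate") mode concrete, the sketch below illustrates how forecasting can be cast as mask-reconstruction with a univariate MAE: history patches are encoded, mask tokens stand in for the future, and the decoder reconstructs them. This is a minimal PyTorch sketch under stated assumptions; the class and method names (`UnivariateMAE`, `forecast`) and all architectural details (patch size, layer counts, the omission of positional encodings) are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn

class UnivariateMAE(nn.Module):
    """Illustrative univariate Masked AutoEncoder (not the paper's code)."""

    def __init__(self, patch_len=16, d_model=64, n_layers=2):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            n_layers)
        # Learnable token substituted at masked (here: future) positions.
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.decoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), 1)
        self.head = nn.Linear(d_model, patch_len)

    def forecast(self, history, horizon_patches):
        # history: (batch, length); in an MTS each channel would be fed
        # through this univariate model independently.
        b, L = history.shape
        patches = history.reshape(b, L // self.patch_len, self.patch_len)
        tokens = self.embed(patches)                           # visible tokens
        mask = self.mask_token.expand(b, horizon_patches, -1)  # "future" slots
        z = self.encoder(tokens)                               # encode history
        z = self.decoder(torch.cat([z, mask], dim=1))          # reconstruct
        future = self.head(z[:, -horizon_patches:])            # masked output
        return future.reshape(b, horizon_patches * self.patch_len)

model = UnivariateMAE()
x = torch.randn(8, 96)                 # 8 univariate instances, length 96
with torch.no_grad():                  # frozen parameters, no fine-tuning
    y = model.forecast(x, horizon_patches=3)
print(y.shape)                         # torch.Size([8, 48])
```

Imputation and anomaly detection follow the same pattern in this framing: mask tokens replace the missing or scored positions instead of being appended at the end.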
Cite
Text
Zhang et al. "UP2ME: Univariate Pre-Training to Multivariate Fine-Tuning as a General-Purpose Framework for Multivariate Time Series Analysis." International Conference on Machine Learning, 2024.
Markdown
[Zhang et al. "UP2ME: Univariate Pre-Training to Multivariate Fine-Tuning as a General-Purpose Framework for Multivariate Time Series Analysis." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/zhang2024icml-up2me/)
BibTeX
@inproceedings{zhang2024icml-up2me,
title = {{UP2ME: Univariate Pre-Training to Multivariate Fine-Tuning as a General-Purpose Framework for Multivariate Time Series Analysis}},
author = {Zhang, Yunhao and Liu, Minghao and Zhou, Shengyang and Yan, Junchi},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {59358--59381},
volume = {235},
url = {https://mlanthology.org/icml/2024/zhang2024icml-up2me/}
}