Multi-Patch Prediction: Adapting Language Models for Time Series Representation Learning

Abstract

In this study, we present $\text{aL\small{LM}4T\small{S}}$, an innovative framework that adapts Large Language Models (LLMs) for time-series representation learning. Central to our approach is the reconception of time-series forecasting as a self-supervised, multi-patch prediction task, which, compared to traditional mask-and-reconstruction methods, captures temporal dynamics in patch representations more effectively. Our strategy encompasses a two-stage training process: (i) a causal continual pre-training phase on various time-series datasets, anchored on next-patch prediction, which effectively aligns LLM capabilities with the intricacies of time-series data; and (ii) fine-tuning for multi-patch prediction in the targeted time-series context. A distinctive element of our framework is the patch-wise decoding layer, which departs from previous methods reliant on sequence-level decoding. This design directly transposes individual patches into temporal sequences, thereby significantly bolstering the model's proficiency in mastering temporal patch-based representations. $\text{aL\small{LM}4T\small{S}}$ demonstrates superior performance in several downstream tasks, proving its effectiveness in deriving temporal representations with enhanced transferability and marking a pivotal advancement in the adaptation of LLMs for time-series analysis.
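The patch-wise decoding layer described in the abstract can be pictured as a small projection head applied to each patch representation independently, rather than a single sequence-level projection over all flattened patch embeddings. The following is a minimal PyTorch sketch of that idea, not the authors' implementation; the class name PatchWiseDecoder, the shapes, and the parameters d_model and patch_len are assumptions for illustration.

# Minimal sketch (assumed shapes and names, not the paper's code):
# each patch embedding is decoded into its own temporal sub-sequence
# by a shared linear head, instead of a sequence-level decoder that
# projects the flattened embeddings of all patches at once.

import torch
import torch.nn as nn


class PatchWiseDecoder(nn.Module):
    """Decode each patch representation into patch_len time steps."""

    def __init__(self, d_model: int, patch_len: int):
        super().__init__()
        # Shared linear head applied to every patch independently.
        self.proj = nn.Linear(d_model, patch_len)

    def forward(self, patch_repr: torch.Tensor) -> torch.Tensor:
        # patch_repr: (batch, num_patches, d_model) from the LLM backbone
        # returns:    (batch, num_patches * patch_len) reconstructed series
        out = self.proj(patch_repr)      # (batch, num_patches, patch_len)
        return out.flatten(start_dim=1)  # stitch patches back in time order


if __name__ == "__main__":
    decoder = PatchWiseDecoder(d_model=768, patch_len=16)
    dummy = torch.randn(4, 8, 768)       # 4 series, 8 patches each
    print(decoder(dummy).shape)           # torch.Size([4, 128])

Because the head acts on one patch at a time, the number of decoder parameters is independent of the number of patches, which is one way to read the abstract's claim that patch-wise decoding strengthens patch-level temporal representations.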

Cite

Text

Bian et al. "Multi-Patch Prediction: Adapting Language Models for Time Series Representation Learning." International Conference on Machine Learning, 2024.

Markdown

[Bian et al. "Multi-Patch Prediction: Adapting Language Models for Time Series Representation Learning." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/bian2024icml-multipatch/)

BibTeX

@inproceedings{bian2024icml-multipatch,
  title     = {{Multi-Patch Prediction: Adapting Language Models for Time Series Representation Learning}},
  author    = {Bian, Yuxuan and Ju, Xuan and Li, Jiangtong and Xu, Zhijian and Cheng, Dawei and Xu, Qiang},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {3889--3912},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/bian2024icml-multipatch/}
}