Language Pre-Training Guided Masking Representation Learning for Time Series Classification
Abstract
Representation learning for time series supports a wide range of downstream tasks and applications in many practical scenarios. However, due to the complexity, spatiotemporality, and continuity of sequential stream data, self-supervised representation learning for time series is even more challenging than for structured data such as images and videos. Moreover, directly applying existing contrastive learning and masked-autoencoder approaches to time series representation learning runs into inherent theoretical limitations, such as ineffective augmentation and masking strategies. To this end, we propose Language Pre-training guided Masking Representation Learning (LPMRL) for time series classification. Specifically, we first propose a novel language-pre-training-guided masking encoder that adaptively samples semantic spatiotemporal patches via natural language descriptions, improving the discriminability of the latent representations. Furthermore, we present a dual-information contrastive learning mechanism that explores both local and global information by carefully constructing high-quality hard negative samples of time series data. We also design various experiments, such as visualizing masking positions and distributions and measuring reconstruction error, to verify the soundness of the proposed language-guided masking technique. Finally, we evaluate the learned representations on classification tasks over 106 time series datasets, which demonstrates the effectiveness of the proposed method.
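To make the two ideas named in the abstract concrete, below is a minimal, self-contained PyTorch sketch of (1) masking time-series patches guided by their similarity to a text embedding from a pre-trained language model, and (2) an InfoNCE-style contrastive loss over the resulting global representations. All module names, tensor shapes, the top-k masking rule, and the weak-augmentation positive here are illustrative assumptions, not the authors' exact design; the frozen language-model embedding is replaced by a stand-in vector.

```python
# Sketch of language-guided masking + contrastive learning for time series.
# Assumptions (not from the paper): patch/embedding sizes, top-k masking,
# a random vector standing in for a frozen language-model text embedding.
import torch
import torch.nn as nn
import torch.nn.functional as F

PATCH_LEN, N_PATCH, D = 16, 8, 64  # a length-128 series split into 8 patches

class GuidedMaskedEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.patch_proj = nn.Linear(PATCH_LEN, D)
        layer = nn.TransformerEncoderLayer(d_model=D, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.decoder = nn.Linear(D, PATCH_LEN)        # reconstruct masked patches
        self.mask_token = nn.Parameter(torch.zeros(1, 1, D))

    def forward(self, x, text_emb, mask_ratio=0.5):
        # x: (B, N_PATCH, PATCH_LEN); text_emb: (D,) from a frozen LM (assumed)
        tokens = self.patch_proj(x)                            # (B, N, D)
        score = F.cosine_similarity(tokens, text_emb, dim=-1)  # (B, N)
        k = int(mask_ratio * N_PATCH)
        # Mask the k patches most aligned with the language description.
        mask_idx = score.topk(k, dim=1).indices                # (B, k)
        masked = tokens.clone()
        masked.scatter_(1, mask_idx.unsqueeze(-1).expand(-1, -1, D),
                        self.mask_token.expand(x.size(0), k, D))
        z = self.encoder(masked)                               # (B, N, D)
        recon = self.decoder(z)                                # (B, N, PATCH_LEN)
        idx = mask_idx.unsqueeze(-1).expand(-1, -1, PATCH_LEN)
        rec_loss = F.mse_loss(recon.gather(1, idx), x.gather(1, idx))
        return z.mean(dim=1), rec_loss                         # global repr, loss

def info_nce(anchor, positive, tau=0.1):
    # In-batch negatives on the off-diagonal; the paper's hard-negative
    # construction is more elaborate, this is only the generic objective.
    a, p = F.normalize(anchor, dim=-1), F.normalize(positive, dim=-1)
    logits = a @ p.t() / tau
    return F.cross_entropy(logits, torch.arange(a.size(0)))

if __name__ == "__main__":
    model = GuidedMaskedEncoder()
    x = torch.randn(4, N_PATCH, PATCH_LEN)   # toy batch of 4 series
    text_emb = torch.randn(D)                # stand-in for a frozen LM embedding
    z1, rec = model(x, text_emb)
    z2, _ = model(x + 0.01 * torch.randn_like(x), text_emb)  # weak augmentation
    loss = rec + info_nce(z1, z2)
    loss.backward()
    print(float(loss))
```

The top-k rule encodes the intuition that patches most aligned with the language description carry the most semantic content and are therefore the most informative to mask and reconstruct; a stochastic sampling of patches weighted by the similarity scores would be an equally plausible reading of "adaptively sampling".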
Cite
Text
Tang et al. "Language Pre-Training Guided Masking Representation Learning for Time Series Classification." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I12.33377
Markdown
[Tang et al. "Language Pre-Training Guided Masking Representation Learning for Time Series Classification." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/tang2025aaai-language/) doi:10.1609/AAAI.V39I12.33377
BibTeX
@inproceedings{tang2025aaai-language,
title = {{Language Pre-Training Guided Masking Representation Learning for Time Series Classification}},
author = {Tang, Liaoyuan and Wang, Zheng and Wang, Jie and He, Guanxiong and Hao, Zhezheng and Wang, Rong and Nie, Feiping},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {12631-12639},
doi = {10.1609/AAAI.V39I12.33377},
url = {https://mlanthology.org/aaai/2025/tang2025aaai-language/}
}