InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting
Abstract
Long-term time series forecasting (LTSF) provides substantial benefits for numerous real-world applications, while placing strong demands on a model's capacity to capture long-range dependencies. Recent Transformer-based models have significantly improved LTSF performance. It is worth noting that the Transformer with its self-attention mechanism was originally proposed to model language sequences, whose tokens (i.e., words) are discrete and highly semantic. Unlike language sequences, however, most time series consist of continuous numeric points. Individual time steps carry temporal redundancy and are weakly semantic, so leveraging only time-domain tokens makes it hard to depict the overall properties of a time series (e.g., the overall trend and periodic variations). To address these problems, we propose a novel Transformer-based forecasting model named InParformer with an Interactive Parallel Attention (InPar Attention) mechanism. InPar Attention is designed to learn long-range dependencies comprehensively in both the frequency and time domains. To improve its learning capacity and efficiency, we further design several mechanisms, including query selection, key-value pair compression, and recombination. Moreover, InParformer is constructed with evolutionary seasonal-trend decomposition modules to enhance intricate temporal pattern extraction. Extensive experiments on six real-world benchmarks show that InParformer outperforms state-of-the-art forecasting Transformers.
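To make the seasonal-trend decomposition idea concrete, below is a minimal, illustrative PyTorch sketch of the moving-average decomposition commonly used in decomposition-based forecasting Transformers: the trend is a smoothed version of the input and the seasonal part is the residual. This is not the paper's evolutionary decomposition module; the class name `SeriesDecomposition` and the kernel size are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Illustrative moving-average decomposition: trend = smoothed series, seasonal = residual."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        # Replicate the first/last time step so the moving average preserves the length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1)
        # AvgPool1d expects (batch, channels, length).
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend


# Usage on a dummy batch of multivariate series.
x = torch.randn(8, 96, 7)                      # (batch, length, channels)
seasonal, trend = SeriesDecomposition(25)(x)
print(seasonal.shape, trend.shape)             # both torch.Size([8, 96, 7])
```

The decomposed components can then be processed by separate branches (e.g., attention over the seasonal part and a simpler model over the trend), which is the general role such modules play in decomposition Transformers.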
Cite
Text

Cao et al. "InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I6.25845

Markdown

[Cao et al. "InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/cao2023aaai-inparformer/) doi:10.1609/AAAI.V37I6.25845

BibTeX
@inproceedings{cao2023aaai-inparformer,
title = {{InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting}},
author = {Cao, Haizhou and Huang, Zhenhao and Yao, Tiechui and Wang, Jue and He, Hui and Wang, Yangang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
  pages = {6906--6915},
doi = {10.1609/AAAI.V37I6.25845},
url = {https://mlanthology.org/aaai/2023/cao2023aaai-inparformer/}
}