Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting
Abstract
Accurately predicting the future from past time series data is of paramount importance, since it enables decision making and risk management ahead of time. In practice, the challenge is to build a flexible yet parsimonious model that can capture a wide range of temporal dependencies. In this paper, we propose Pyraformer, which explores the multiresolution representation of the time series. Specifically, we introduce the pyramidal attention module (PAM), in which the inter-scale tree structure summarizes features at different resolutions and the intra-scale neighboring connections model temporal dependencies of different ranges. Under mild conditions, the maximum length of the signal traversal path in Pyraformer is a constant (i.e., $\mathcal O(1)$) with respect to the sequence length $L$, while its time and space complexity scale linearly with $L$. Extensive numerical results show that Pyraformer typically achieves the highest prediction accuracy in both single-step and long-range forecasting tasks with the least time and memory consumption, especially when the sequence is long.
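As a rough illustration of the sparse graph the PAM attends over, the sketch below builds a boolean attention mask from intra-scale neighbor links and inter-scale parent/child links. The function name and the parameters (number of scales, coarsening factor C, neighbor window A) are illustrative assumptions, not the paper's exact configuration.

import numpy as np

def pyramidal_attention_mask(L, num_scales=4, C=4, A=3):
    """Sketch of a pyramidal attention mask (hypothetical helper, not the authors' code).

    Scale 0 holds the original L time steps; each coarser scale has roughly
    1/C as many nodes. A node attends to its A nearest neighbors on the same
    scale (intra-scale) and to its parent/children across scales (inter-scale).
    """
    # Number of nodes per scale: L, L/C, L/C^2, ...
    sizes = [max(1, L // (C ** s)) for s in range(num_scales)]
    offsets = np.cumsum([0] + sizes)   # start index of each scale in the flat node list
    N = offsets[-1]                    # total nodes in the pyramid
    mask = np.zeros((N, N), dtype=bool)

    for s, size in enumerate(sizes):
        base = offsets[s]
        for i in range(size):
            u = base + i
            # Intra-scale: attend to the A nearest neighbors on the same scale.
            lo, hi = max(0, i - A // 2), min(size, i + A // 2 + 1)
            mask[u, base + lo: base + hi] = True
            # Inter-scale: attend to the parent at the coarser scale (and vice versa).
            if s + 1 < num_scales:
                parent = offsets[s + 1] + min(i // C, sizes[s + 1] - 1)
                mask[u, parent] = True
                mask[parent, u] = True
    return mask

For example, pyramidal_attention_mask(64) yields 85 nodes with a constant number of connections per node, so the number of attended pairs grows linearly with L, consistent with the linear time and space complexity claimed in the abstract.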
Cite
Text
Liu et al. "Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting." International Conference on Learning Representations, 2022.
Markdown
[Liu et al. "Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/liu2022iclr-pyraformer/)
BibTeX
@inproceedings{liu2022iclr-pyraformer,
title = {{Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting}},
author = {Liu, Shizhan and Yu, Hang and Liao, Cong and Li, Jianguo and Lin, Weiyao and Liu, Alex X. and Dustdar, Schahram},
booktitle = {International Conference on Learning Representations},
year = {2022},
url = {https://mlanthology.org/iclr/2022/liu2022iclr-pyraformer/}
}