E-ProTran: Efficient Probabilistic Transformers for Forecasting
Abstract
Time series forecasting involves predicting future data points from historical patterns and is critical in fields such as healthcare, financial markets, and weather forecasting, where scalability and efficiency, particularly in training and inference times, are paramount. Transformers, known for their ability to capture long-range dependencies in sequential data, have shown promise in time series analysis. However, the complexity of transformer models can lead to overparameterization, long training times, and scalability challenges, which become even more problematic when the assumptions of the underlying generative model are overly complex. In this paper, we introduce E-ProTran by redesigning a state-of-the-art transformer for probabilistic time series forecasting. We empirically demonstrate that E-ProTran maintains high performance while significantly improving efficiency, without necessarily reconstructing the conditioned history. Our model incorporates simplified attention layers and design adjustments that reduce computational overhead without compromising accuracy, offering a more efficient and scalable solution for time series forecasting.
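The abstract describes the general recipe (a slim attention encoder paired with a probabilistic forecast head) but not the exact architecture, so the sketch below is purely illustrative. It is a minimal, hypothetical PyTorch example of a single-attention-layer forecaster with a diagonal Gaussian output; the class name, layer sizes, and Gaussian parameterization are assumptions, not the authors' design.

```python
# Hypothetical sketch only: E-ProTran's actual architecture is not given on
# this page. This illustrates the general idea of a simplified attention
# encoder feeding a probabilistic (Gaussian) forecast head.
import torch
import torch.nn as nn


class TinyProbForecaster(nn.Module):
    def __init__(self, d_in: int, d_model: int = 64, n_heads: int = 4, horizon: int = 24):
        super().__init__()
        self.embed = nn.Linear(d_in, d_model)          # project raw observations
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Two linear heads parameterize a diagonal Gaussian over the horizon.
        self.mean = nn.Linear(d_model, horizon * d_in)
        self.log_std = nn.Linear(d_model, horizon * d_in)
        self.horizon, self.d_in = horizon, d_in

    def forward(self, history: torch.Tensor) -> torch.distributions.Normal:
        h = self.embed(history)                        # (B, T, d_model)
        a, _ = self.attn(h, h, h)                      # one self-attention pass
        z = self.norm(h + a)[:, -1]                    # summary of the history
        mu = self.mean(z).view(-1, self.horizon, self.d_in)
        sigma = self.log_std(z).view(-1, self.horizon, self.d_in).exp()
        return torch.distributions.Normal(mu, sigma)   # predictive distribution


model = TinyProbForecaster(d_in=1)
dist = model(torch.randn(8, 96, 1))                    # 8 series, 96 past steps
nll = -dist.log_prob(torch.randn(8, 24, 1)).mean()     # train by minimizing NLL
```

Note that the forecast is a distribution rather than a point estimate, so training minimizes negative log-likelihood and the model can report predictive uncertainty, which is the defining feature of probabilistic forecasting.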
Cite
Text
Koyuncu et al. "E-ProTran: Efficient Probabilistic Transformers for Forecasting." ICML 2024 Workshops: SPIGM, 2024.
Markdown
[Koyuncu et al. "E-ProTran: Efficient Probabilistic Transformers for Forecasting." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/koyuncu2024icmlw-eprotran/)
BibTeX
@inproceedings{koyuncu2024icmlw-eprotran,
title = {{E-ProTran: Efficient Probabilistic Transformers for Forecasting}},
author = {Koyuncu, Batuhan and Bauerschmidt, Tim Nico and Valera, Isabel},
booktitle = {ICML 2024 Workshops: SPIGM},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/koyuncu2024icmlw-eprotran/}
}