Transformer Conformal Prediction for Time Series
Abstract
We present a conformal prediction method for time series that uses the Transformer architecture to capture long-memory and long-range dependencies. Specifically, we use the Transformer decoder as a conditional quantile estimator to predict the quantiles of prediction residuals, which are then used to construct the prediction interval. We hypothesize that the Transformer decoder improves prediction-interval estimation by learning temporal dependencies across past prediction residuals. Comprehensive experiments on simulated and real data empirically demonstrate the superiority of the proposed method over existing state-of-the-art conformal prediction methods.
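To make the mechanism concrete, below is a minimal PyTorch sketch of the idea as the abstract describes it: a small Transformer decoder is trained on windows of past prediction residuals with the pinball (quantile) loss, and its predicted lower/upper residual quantiles are added to a point forecast to form a prediction interval. All module names, hyperparameters, and the toy data here are illustrative assumptions, not the authors' implementation.

```python
# Sketch of residual-quantile estimation with a Transformer decoder.
# Assumptions (not from the paper): model sizes, window length, toy data.
import torch
import torch.nn as nn

class ResidualQuantileDecoder(nn.Module):
    """Predicts quantiles of the next prediction residual from past residuals."""

    def __init__(self, d_model=32, nhead=4, num_layers=2, quantiles=(0.05, 0.95)):
        super().__init__()
        self.quantiles = quantiles
        self.embed = nn.Linear(1, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.head = nn.Linear(d_model, len(quantiles))

    def forward(self, residuals):
        # residuals: (batch, window, 1), past prediction residuals
        h = self.embed(residuals)
        # Causal mask; the window itself serves as memory in this decoder-only sketch.
        mask = nn.Transformer.generate_square_subsequent_mask(h.size(1))
        h = self.decoder(h, memory=h, tgt_mask=mask, memory_mask=mask)
        return self.head(h[:, -1])  # (batch, n_quantiles) for the next residual

def pinball_loss(pred, target, quantiles):
    """Quantile (pinball) loss, averaged over quantile levels."""
    losses = []
    for i, q in enumerate(quantiles):
        err = target - pred[:, i]
        losses.append(torch.max(q * err, (q - 1) * err).mean())
    return torch.stack(losses).mean()

# Toy usage: residual windows -> interval around a point forecast.
torch.manual_seed(0)
model = ResidualQuantileDecoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
windows = torch.randn(64, 20, 1)   # past residuals (illustrative data)
next_resid = torch.randn(64)       # next-step residuals to fit the quantiles on
for _ in range(5):
    opt.zero_grad()
    loss = pinball_loss(model(windows), next_resid, model.quantiles)
    loss.backward()
    opt.step()

point_forecast = 1.0               # from any base forecaster (assumed)
with torch.no_grad():
    q_lo, q_hi = model(windows[:1])[0].tolist()
print(f"90% interval: [{point_forecast + q_lo:.3f}, {point_forecast + q_hi:.3f}]")
```

In the actual method, the quantile levels would be set to match the target coverage, and the residual windows would come from a fitted base forecaster rather than random data.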
Cite
Text
Lee et al. "Transformer Conformal Prediction for Time Series." ICML 2024 Workshops: SPIGM, 2024.

Markdown

[Lee et al. "Transformer Conformal Prediction for Time Series." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/lee2024icmlw-transformer/)

BibTeX
@inproceedings{lee2024icmlw-transformer,
  title     = {{Transformer Conformal Prediction for Time Series}},
  author    = {Lee, Junghwan and Xu, Chen and Xie, Yao},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/lee2024icmlw-transformer/}
}