Conformal Prediction Interval for Dynamic Time-Series

Abstract

We develop a method to construct distribution-free prediction intervals for dynamic time-series, called EnbPI, which wraps around any bootstrap ensemble estimator to construct sequential prediction intervals. EnbPI is closely related to the conformal prediction (CP) framework but does not require data exchangeability. Theoretically, these intervals attain finite-sample, approximately valid marginal coverage for broad classes of regression functions and time-series with strongly mixing stochastic errors. Computationally, EnbPI avoids overfitting and requires neither data-splitting nor training multiple ensemble estimators; it efficiently aggregates bootstrap estimators that have already been trained. In general, EnbPI is easy to implement, scalable to producing arbitrarily many prediction intervals sequentially, and well-suited to a wide range of regression functions. We perform extensive real-data analyses to demonstrate its effectiveness.
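To make the wrap-around idea concrete, below is a minimal Python sketch of an EnbPI-style procedure: train B bootstrap copies of a base regressor, form leave-one-out residuals by aggregating only the models that did not see a given training point, and slide the residual window forward as new labels arrive. This is an illustration under simplifying assumptions (symmetric intervals centred on the ensemble mean, mean aggregation, a fixed residual window); the function and variable names are our own and the authors' implementation may differ in its aggregation and quantile choices.

import numpy as np
from sklearn.ensemble import RandomForestRegressor  # any base regressor can be wrapped

def enbpi_style_intervals(X_train, y_train, X_test, y_test, alpha=0.1, B=20, seed=0):
    """Sketch of sequential prediction intervals from pre-trained bootstrap models."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    boot_models, boot_masks = [], []
    for _ in range(B):
        idx = rng.integers(0, n, n)                      # bootstrap resample of the training set
        model = RandomForestRegressor(n_estimators=50, random_state=0)
        model.fit(X_train[idx], y_train[idx])
        boot_models.append(model)
        mask = np.zeros(n, dtype=bool)
        mask[idx] = True
        boot_masks.append(mask)

    # Leave-one-out residuals: for each training point, aggregate only the
    # bootstrap models whose resample did not contain that point.
    preds = np.stack([m.predict(X_train) for m in boot_models])   # shape (B, n)
    in_boot = np.stack(boot_masks)                                 # shape (B, n)
    residuals = []
    for i in range(n):
        out = ~in_boot[:, i]
        loo_pred = preds[out, i].mean() if out.any() else preds[:, i].mean()
        residuals.append(abs(y_train[i] - loo_pred))

    # Sequential prediction: update the residual window after each true value is observed,
    # so no retraining of the ensemble is needed.
    lower, upper = [], []
    test_preds = np.stack([m.predict(X_test) for m in boot_models]).mean(axis=0)
    for t, y_hat in enumerate(test_preds):
        w = np.quantile(residuals, 1 - alpha)
        lower.append(y_hat - w)
        upper.append(y_hat + w)
        residuals.append(abs(y_test[t] - y_hat))  # feedback once y_t is revealed
        residuals.pop(0)                          # keep the window size fixed
    return np.array(lower), np.array(upper)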

Cite

Text

Xu and Xie. "Conformal Prediction Interval for Dynamic Time-Series." International Conference on Machine Learning, 2021.

Markdown

[Xu and Xie. "Conformal Prediction Interval for Dynamic Time-Series." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/xu2021icml-conformal/)

BibTeX

@inproceedings{xu2021icml-conformal,
  title     = {{Conformal Prediction Interval for Dynamic Time-Series}},
  author    = {Xu, Chen and Xie, Yao},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {11559--11569},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/xu2021icml-conformal/}
}