Sequential Changepoint Detection via Backward Confidence Sequences

Abstract

We present a simple reduction from sequential estimation to sequential changepoint detection (SCD). In short, suppose we are interested in detecting changepoints in some parameter or functional $\theta$ of the underlying distribution. We demonstrate that if we can construct a confidence sequence (CS) for $\theta$, then we can also successfully perform SCD for $\theta$. This is accomplished by checking if two CSs — one forwards and the other backwards — ever fail to intersect. Since the literature on CSs has been rapidly evolving recently, the reduction provided in this paper immediately solves several old and new change detection problems. Further, our “backward CS”, constructed by reversing time, is new and potentially of independent interest. We provide strong nonasymptotic guarantees on the frequency of false alarms and detection delay, and demonstrate numerical effectiveness on several problems.
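The detection rule described in the abstract can be illustrated with a minimal sketch. The following is not the paper's implementation; it assumes 1-sub-Gaussian observations and uses a simple union-bound Hoeffding-style confidence-sequence radius (the paper's actual CS constructions are tighter). At each time `t`, it builds a forward confidence interval over each prefix and a backward one over the corresponding suffix, and declares a change the first time some pair of intervals is disjoint:

```python
import numpy as np

def cs_radius(t, alpha, sigma=1.0):
    """Union-bound Hoeffding-style confidence-sequence radius for the mean
    of t sigma-sub-Gaussian observations, valid uniformly over t."""
    return sigma * np.sqrt(2.0 * np.log(2.0 * t * (t + 1) / alpha) / t)

def detect_changepoint(x, alpha=0.05, sigma=1.0):
    """Return the first time t at which a forward CS (over a prefix) and a
    backward CS (over the time-reversed suffix) fail to intersect, or None."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    for t in range(2, n + 1):
        for k in range(1, t):
            fwd, bwd = x[:k], x[k:t]          # prefix / suffix at split k
            fm, bm = fwd.mean(), bwd.mean()
            fr = cs_radius(len(fwd), alpha, sigma)
            br = cs_radius(len(bwd), alpha, sigma)
            # Disjoint intervals -> the two CSs cannot share a parameter value
            if fm + fr < bm - br or bm + br < fm - fr:
                return t
    return None
```

Because both confidence sequences are valid uniformly over time, a false alarm requires at least one of them to miscover, which bounds the false-alarm frequency in the spirit of the paper's guarantee; the detection delay shrinks as the size of the mean shift grows.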

Cite

Text

Shekhar and Ramdas. "Sequential Changepoint Detection via Backward Confidence Sequences." International Conference on Machine Learning, 2023.

Markdown

[Shekhar and Ramdas. "Sequential Changepoint Detection via Backward Confidence Sequences." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/shekhar2023icml-sequential/)

BibTeX

@inproceedings{shekhar2023icml-sequential,
  title     = {{Sequential Changepoint Detection via Backward Confidence Sequences}},
  author    = {Shekhar, Shubhanshu and Ramdas, Aaditya},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {30908--30930},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/shekhar2023icml-sequential/}
}