Sequential Bayesian Prediction in the Presence of Changepoints

Abstract

We introduce a new sequential algorithm for making robust predictions in the presence of changepoints. Unlike previous approaches, which focus on detecting and locating changepoints, our algorithm addresses the problem of making predictions even when such changes might be present. We introduce nonstationary covariance functions to be used in Gaussian process prediction that model such changes, then proceed to demonstrate how to effectively manage the hyperparameters associated with those covariance functions. By using Bayesian quadrature, we can integrate out the hyperparameters, allowing us to calculate the marginal predictive distribution. Furthermore, if desired, the posterior distribution over putative changepoint locations can be calculated as a natural byproduct of our prediction algorithm.
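The nonstationary covariance functions described above can be illustrated with a simple construction (a sketch of the general idea, not necessarily the paper's exact covariance family): given a putative changepoint location `xc` (an assumed hyperparameter name here), function values on opposite sides of the changepoint are modelled as uncorrelated, so the covariance is zeroed across `xc`:

```python
import numpy as np

def se_kernel(x1, x2, length_scale=1.0, signal_var=1.0):
    """Standard stationary squared-exponential covariance."""
    d = np.subtract.outer(x1, x2)
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def changepoint_kernel(x1, x2, xc, length_scale=1.0, signal_var=1.0):
    """Nonstationary covariance modelling a drastic change at xc:
    points on opposite sides of the changepoint are uncorrelated,
    so the process before and after xc behaves independently."""
    k = se_kernel(x1, x2, length_scale, signal_var)
    # True where both inputs lie on the same side of the changepoint
    same_side = np.equal.outer(x1 < xc, x2 < xc)
    return k * same_side
```

With inputs ordered in time, the resulting covariance matrix is block diagonal, one block per regime; in the paper's setting, hyperparameters such as the changepoint location would then be marginalized via Bayesian quadrature rather than fixed as above.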

Cite

Text

Garnett et al. "Sequential Bayesian Prediction in the Presence of Changepoints." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553418

Markdown

[Garnett et al. "Sequential Bayesian Prediction in the Presence of Changepoints." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/garnett2009icml-sequential/) doi:10.1145/1553374.1553418

BibTeX

@inproceedings{garnett2009icml-sequential,
  title     = {{Sequential Bayesian Prediction in the Presence of Changepoints}},
  author    = {Garnett, Roman and Osborne, Michael A. and Roberts, Stephen J.},
  booktitle = {International Conference on Machine Learning},
  year      = {2009},
  pages     = {345--352},
  doi       = {10.1145/1553374.1553418},
  url       = {https://mlanthology.org/icml/2009/garnett2009icml-sequential/}
}