Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with $\beta$-Divergences
Abstract
We present the very first robust Bayesian Online Changepoint Detection algorithm through General Bayesian Inference (GBI) with $\beta$-divergences. The resulting inference procedure is doubly robust for both the predictive and the changepoint (CP) posterior, with linear time and constant space complexity. We provide a construction for exponential models and demonstrate it on the Bayesian Linear Regression model. In so doing, we make two additional contributions: Firstly, we make GBI scalable using Structural Variational approximations that are exact as $\beta \to 0$. Secondly, we give a principled way of choosing the divergence parameter $\beta$ by minimizing expected predictive loss on-line. Reducing False Discovery Rates of CPs from up to 99\% to 0\% on real-world data, this offers the state of the art.
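For readers who want a concrete sense of the loss underlying the abstract, below is a minimal, illustrative Python sketch of the $\beta$-divergence (density power) loss that General Bayesian Inference substitutes for the negative log-likelihood, written here for a univariate Gaussian model. The function name and the Gaussian choice are assumptions for illustration, not the authors' code: as $\beta \to 0$ the loss recovers the log score, and for $\beta > 0$ the loss of an outlying observation stays bounded, which is what drives the robustness.

import numpy as np
from scipy.stats import norm

def beta_divergence_loss(x, mu, sigma, beta):
    """beta-divergence (density power) loss for a Gaussian N(mu, sigma^2).

    Stands in for the negative log-likelihood in a General Bayesian posterior;
    as beta -> 0 it reduces to the log score (up to an additive constant).
    """
    if beta <= 0.0:
        # Standard Bayes: negative log-likelihood (log score), unbounded in x.
        return -norm.logpdf(x, loc=mu, scale=sigma)
    lik = norm.pdf(x, loc=mu, scale=sigma)
    # Closed-form integral of the (1+beta)-th power of a Gaussian density.
    integral = (2.0 * np.pi * sigma**2) ** (-beta / 2.0) / np.sqrt(1.0 + beta)
    return -(lik**beta) / beta + integral / (1.0 + beta)

# A gross outlier incurs a bounded loss for beta > 0, unlike the log score:
# beta_divergence_loss(50.0, 0.0, 1.0, 0.2)  -> finite, close to integral/(1+beta)
# beta_divergence_loss(50.0, 0.0, 1.0, 0.0)  -> roughly 1250, grows with x**2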
Cite
Text
Knoblauch et al. "Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with $\beta$-Divergences." Neural Information Processing Systems, 2018.
Markdown
[Knoblauch et al. "Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with $\beta$-Divergences." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/knoblauch2018neurips-doubly/)
BibTeX
@inproceedings{knoblauch2018neurips-doubly,
  title     = {{Doubly Robust Bayesian Inference for Non-Stationary Streaming Data with $\beta$-Divergences}},
  author    = {Knoblauch, Jeremias and Jewson, Jack E and Damoulas, Theodoros},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {64--75},
  url       = {https://mlanthology.org/neurips/2018/knoblauch2018neurips-doubly/}
}