Variational Inference with Censored Gaussian Process Regressors

Abstract

We consider the problem of Bayesian inference when some observations have been censored. In censored data, the dependent variable has been clipped, so we only know that the true value is at least as large (or as small) as the observed value. Such data can be modeled using a Tobit likelihood, which can be viewed as a mixture of a normal distribution restricted to the uncensored region and a point mass at the censoring boundary. This requires careful treatment when evaluating information-theoretic quantities, due to the mixture of continuous and discrete probability measures. We introduce a novel approximate inference scheme for Gaussian process models with a Tobit likelihood, derive an interpretable analytic expression for the Gaussian process evidence lower bound (ELBO), and demonstrate the resulting model's efficiency in learning Gaussian process posteriors from censored data relative to the uncensored case.
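For concreteness, a standard Tobit likelihood for right-censoring at a threshold u (a generic sketch in our own notation, not necessarily the paper's) combines a Gaussian density on uncensored observations with a point mass on censored ones:

$$
p(y_i \mid f_i) =
\begin{cases}
\mathcal{N}(y_i \mid f_i, \sigma^2), & y_i < u \quad \text{(uncensored)}, \\[4pt]
\Phi\!\left(\dfrac{f_i - u}{\sigma}\right), & y_i = u \quad \text{(censored)},
\end{cases}
$$

where $f_i$ is the latent Gaussian process value, $\sigma$ the observation noise scale, and $\Phi$ the standard normal CDF; the censored branch is the probability that the latent response exceeds the threshold, which is exactly the point mass at the boundary described above.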

Cite

Text

Karlova et al. "Variational Inference with Censored Gaussian Process Regressors." ICML 2024 Workshops: SPIGM, 2024.

Markdown

[Karlova et al. "Variational Inference with Censored Gaussian Process Regressors." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/karlova2024icmlw-variational/)

BibTeX

@inproceedings{karlova2024icmlw-variational,
  title     = {{Variational Inference with Censored Gaussian Process Regressors}},
  author    = {Karlova, Andrea and Kabra, Rishabh and de Souza, Daniel Augusto and Paige, Brooks},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/karlova2024icmlw-variational/}
}