Bootstrapping Neural Processes

Abstract

Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven capable of handling various types of data, NPs still rely on the assumption that uncertainty in the stochastic process is captured by a single latent variable, which potentially limits flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular parametric form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
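
The core classical idea the abstract refers to is the paired bootstrap: resample the observed data with replacement and measure variability across the resulting fits. Below is a minimal NumPy sketch of this resampling step applied to an NP-style context set; the function name `bootstrap_contexts` and the toy data are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def bootstrap_contexts(x_ctx, y_ctx, n_samples=10, rng=None):
    """Paired bootstrap: resample (x, y) context pairs with replacement.

    Each resampled context set induces a different fit; the spread across
    fits serves as a data-driven uncertainty estimate, the classical idea
    that BNP adapts to Neural Processes.
    """
    rng = np.random.default_rng(rng)
    n = len(x_ctx)
    for _ in range(n_samples):
        idx = rng.integers(0, n, size=n)  # indices drawn with replacement
        yield x_ctx[idx], y_ctx[idx]

# Toy usage: in a BNP-like setup, each bootstrapped context set would be
# encoded separately, and the variability across the resulting predictions
# models functional uncertainty without a single Gaussian latent variable.
x = np.linspace(-1.0, 1.0, 20)
y = np.sin(3 * x) + 0.1 * np.random.default_rng(0).standard_normal(20)
boot_sets = list(bootstrap_contexts(x, y, n_samples=5, rng=0))
```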

Cite

Text

Lee et al. "Bootstrapping Neural Processes." Neural Information Processing Systems, 2020.

Markdown

[Lee et al. "Bootstrapping Neural Processes." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/lee2020neurips-bootstrapping/)

BibTeX

@inproceedings{lee2020neurips-bootstrapping,
  title     = {{Bootstrapping Neural Processes}},
  author    = {Lee, Juho and Lee, Yoonho and Kim, Jungtaek and Yang, Eunho and Hwang, Sung Ju and Teh, Yee Whye},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/lee2020neurips-bootstrapping/}
}