Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees

Abstract

We study the fundamental problem of learning an unknown, smooth probability function via pointwise Bernoulli tests. We provide a scalable algorithm for efficiently solving this problem with rigorous guarantees. In particular, we prove a convergence rate for our posterior update rule to the true probability function in L2-norm. Moreover, we allow the Bernoulli tests to depend on contextual features, and provide a modified inference engine with provable guarantees for this novel setting. Numerical results show that the empirical convergence rates match the theory, and illustrate the superiority of our approach in handling contextual features over the state of the art.
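
For intuition about the problem setting, below is a minimal, hypothetical sketch (not the authors' algorithm): it estimates a smooth probability function on [0, 1] from pointwise Bernoulli tests by maintaining a Beta posterior at each grid point and sharing every observation with nearby grid points via a Gaussian kernel weight. The ground-truth function f_true, the grid resolution, and the bandwidth parameter are all illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def f_true(x):
    # Unknown smooth probability function (ground truth for the simulation).
    return 0.5 + 0.4 * np.sin(2 * np.pi * x)

grid = np.linspace(0.0, 1.0, 101)
alpha = np.ones_like(grid)   # Beta prior pseudo-counts for "success"
beta = np.ones_like(grid)    # Beta prior pseudo-counts for "failure"
bandwidth = 0.05             # kernel width; controls smoothing strength (assumed)

for _ in range(5000):
    x = rng.uniform()                      # location of the Bernoulli test
    y = rng.uniform() < f_true(x)          # outcome of the test at x
    w = np.exp(-0.5 * ((grid - x) / bandwidth) ** 2)  # kernel weights
    if y:
        alpha += w                         # credit "success" to nearby points
    else:
        beta += w                          # credit "failure" to nearby points

f_hat = alpha / (alpha + beta)             # posterior-mean estimate of f
l2_err = np.sqrt(np.mean((f_hat - f_true(grid)) ** 2))
print(f"empirical L2 error: {l2_err:.4f}")

Shrinking the bandwidth trades smoothing bias for variance; the empirical L2 error printed at the end mirrors the L2-norm criterion under which the paper states its convergence guarantees.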

Cite

Text

Rolland et al. "Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees." International Conference on Machine Learning, 2019.

Markdown

[Rolland et al. "Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/rolland2019icml-efficient/)

BibTeX

@inproceedings{rolland2019icml-efficient,
  title     = {{Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees}},
  author    = {Rolland, Paul and Kavis, Ali and Immer, Alexander and Singla, Adish and Cevher, Volkan},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {5459--5467},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/rolland2019icml-efficient/}
}