Geometric Rates of Convergence for Kernel-Based Sampling Algorithms

Abstract

The rate of convergence of weighted kernel herding (WKH) and sequential Bayesian quadrature (SBQ), two kernel-based sampling algorithms for estimating integrals with respect to a target probability measure, is investigated. Under verifiable conditions on the chosen kernel and target measure, we establish a near-geometric rate of convergence for target measures that are nearly atomic. Furthermore, we show these algorithms perform comparably to the theoretical best possible sampling algorithm under the maximum mean discrepancy. An analysis is also conducted in a distributed setting. Our theoretical developments are supported by empirical observations on simulated data as well as a real-world application.
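
To make the setting concrete, below is a minimal sketch of greedy kernel-based sampling in the spirit of kernel herding: points are selected so that their empirical measure tracks the kernel mean embedding of the target, which keeps the maximum mean discrepancy small. This is an illustrative sketch only, assuming an RBF kernel and a target given by weighted atoms; the helper names are hypothetical, and it omits the quadrature weights that WKH and SBQ additionally compute. It is not the authors' implementation.

    # Illustrative sketch of kernel herding against a discrete target measure
    # (hypothetical helpers; not the paper's WKH/SBQ implementation).
    import numpy as np

    def rbf_kernel(A, B, bandwidth=1.0):
        # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * bandwidth ** 2))

    def kernel_herding(candidates, target_weights, n_samples, bandwidth=1.0):
        # Greedily pick candidate points whose equal-weight empirical measure
        # stays close (in MMD) to the target's kernel mean embedding.
        K = rbf_kernel(candidates, candidates, bandwidth)
        mu = K @ target_weights  # kernel mean embedding evaluated at each candidate
        selected = []
        for t in range(n_samples):
            # Herding score: alignment with the target mean minus average
            # similarity to the points already selected.
            penalty = K[:, selected].sum(axis=1) / (t + 1) if selected else 0.0
            selected.append(int(np.argmax(mu - penalty)))
        return candidates[selected]

    # Usage sketch: approximate a cloud of weighted atoms with 20 herded points.
    rng = np.random.default_rng(0)
    atoms = rng.normal(size=(500, 2))
    weights = np.full(500, 1.0 / 500)
    samples = kernel_herding(atoms, weights, n_samples=20)

WKH and SBQ refine this greedy scheme by attaching non-uniform (quadrature) weights to the selected points, which is the setting in which the paper's near-geometric rates are established.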

Cite

Text

Khanna et al. "Geometric Rates of Convergence for Kernel-Based Sampling Algorithms." Uncertainty in Artificial Intelligence, 2021.

Markdown

[Khanna et al. "Geometric Rates of Convergence for Kernel-Based Sampling Algorithms." Uncertainty in Artificial Intelligence, 2021.](https://mlanthology.org/uai/2021/khanna2021uai-geometric/)

BibTeX

@inproceedings{khanna2021uai-geometric,
  title     = {{Geometric Rates of Convergence for Kernel-Based Sampling Algorithms}},
  author    = {Khanna, Rajiv and Hodgkinson, Liam and Mahoney, Michael W.},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2021},
  pages     = {2156--2164},
  volume    = {161},
  url       = {https://mlanthology.org/uai/2021/khanna2021uai-geometric/}
}