Kernel Quadrature with DPPs
Abstract
We study quadrature rules for functions living in an RKHS, using nodes sampled from a projection determinantal point process (DPP). DPPs are parametrized by a kernel, and we use a truncated and saturated version of the RKHS kernel. This natural link between the two kernels, along with DPP machinery, leads to relatively tight bounds on the quadrature error that depend on the spectrum of the RKHS kernel. Finally, we experimentally compare DPPs to existing kernel-based quadratures such as herding, Bayesian quadrature, and continuous leverage score sampling. Numerical results confirm the interest of DPPs, and even suggest faster rates than our bounds in particular cases.
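To make the setup concrete, here is a minimal numerical sketch of the pipeline the abstract describes: sample N nodes from a rank-N projection DPP (discretized on a grid of [0, 1], using the standard sequential sampler), then compute kernel quadrature weights by solving a linear system against the mean embedding of the base measure. The Fourier eigenbasis, the spectrum sigma_n = 1/(1+n)^2, and the rank-N truncation are illustrative choices, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 5                       # number of nodes = rank of the projection DPP
M = 2000                    # grid size discretizing [0, 1]
grid = (np.arange(M) + 0.5) / M

def fourier_basis(x, n_funcs):
    """Orthonormal Fourier basis on [0, 1]: 1, sqrt(2)cos, sqrt(2)sin, ..."""
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < n_funcs:
        cols.append(np.sqrt(2) * np.cos(2 * np.pi * k * x))
        if len(cols) < n_funcs:
            cols.append(np.sqrt(2) * np.sin(2 * np.pi * k * x))
        k += 1
    return np.stack(cols, axis=-1)

# Feature matrix of the discretized projection DPP (rows ~ grid points).
F = fourier_basis(grid, N) / np.sqrt(M)

def sample_projection_dpp(F, rng):
    """Sequential sampler: draw a point, then project its feature out."""
    F = F.copy()
    idx = []
    for _ in range(F.shape[1]):
        p = np.sum(F ** 2, axis=1)       # marginal probabilities
        p /= p.sum()
        i = rng.choice(len(p), p=p)
        idx.append(i)
        v = F[i] / np.linalg.norm(F[i])
        F = F - np.outer(F @ v, v)       # remove the sampled direction
    return np.array(idx)

nodes = grid[sample_projection_dpp(F, rng)]

# RKHS kernel truncated at rank N, with decaying spectrum sigma_n.
sigma = 1.0 / (1.0 + np.arange(N)) ** 2
Phi_X = fourier_basis(nodes, N)                    # N x N
K_XX = Phi_X @ np.diag(sigma) @ Phi_X.T

# Mean embedding of Lebesgue measure: only phi_0 = 1 integrates to 1.
mu_X = sigma[0] * Phi_X[:, 0]
w = np.linalg.solve(K_XX, mu_X)                    # quadrature weights

# Test function in the span of the first N eigenfunctions;
# its integral over [0, 1] is the coefficient on phi_0, i.e. 1.
c = np.array([1.0, 0.3, 0.2, 0.0, 0.0])
f = lambda x: fourier_basis(x, N) @ c
estimate = w @ f(nodes)
print(estimate)
```

With the kernel truncated at the same rank as the DPP, these weights integrate any function in the span of the first N eigenfunctions exactly (whenever the node feature matrix is invertible, which holds almost surely under the projection DPP), so the printed estimate matches the true integral up to round-off.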
Cite
Text
Belhadji et al. "Kernel Quadrature with DPPs." Neural Information Processing Systems, 2019.

Markdown
[Belhadji et al. "Kernel Quadrature with DPPs." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/belhadji2019neurips-kernel/)

BibTeX
@inproceedings{belhadji2019neurips-kernel,
title = {{Kernel Quadrature with DPPs}},
author = {Belhadji, Ayoub and Bardenet, Rémi and Chainais, Pierre},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {12927--12937},
url = {https://mlanthology.org/neurips/2019/belhadji2019neurips-kernel/}
}