Testing Determinantal Point Processes

Abstract

Determinantal point processes (DPPs) are popular probabilistic models of diversity. In this paper, we investigate DPPs from a new perspective: property testing of distributions. Given sample access to an unknown distribution $q$ over the subsets of a ground set, we aim to distinguish whether $q$ is a DPP distribution or $\epsilon$-far from all DPP distributions in $\ell_1$-distance. We propose the first algorithm for testing DPPs and establish a matching lower bound on the sample complexity of DPP testing. This lower bound also extends to a new hardness result for testing the more general class of log-submodular distributions.
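
To make the setup concrete, here is a minimal sketch (not the paper's testing algorithm) of the two objects the abstract refers to: an L-ensemble DPP, which assigns each subset $S$ of the ground set probability $\det(L_S)/\det(L+I)$, and the $\ell_1$-distance between two distributions over subsets. The kernel matrix and helper names below are illustrative assumptions, not anything specified in the paper.

```python
# Illustrative sketch of an L-ensemble DPP over subsets of a small ground set
# and the l1-distance between two distributions over subsets.
import itertools
import numpy as np

def dpp_distribution(L):
    """Return the L-ensemble DPP P(S) = det(L_S) / det(L + I)
    as a dict mapping frozenset -> probability."""
    n = L.shape[0]
    Z = np.linalg.det(L + np.eye(n))  # normalizing constant: sum_S det(L_S)
    dist = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            det_S = 1.0 if r == 0 else np.linalg.det(L[np.ix_(S, S)])
            dist[frozenset(S)] = det_S / Z
    return dist

def l1_distance(p, q):
    """l1-distance between two distributions over subsets (dicts)."""
    keys = set(p) | set(q)
    return sum(abs(p.get(S, 0.0) - q.get(S, 0.0)) for S in keys)

if __name__ == "__main__":
    # Example positive semidefinite kernel on a ground set of size 3.
    B = np.array([[1.0, 0.2, 0.0],
                  [0.2, 1.0, 0.5],
                  [0.0, 0.5, 1.0]])
    L = B @ B.T
    p = dpp_distribution(L)
    print("probabilities sum to", sum(p.values()))  # ~1.0
    # A non-DPP distribution for comparison: uniform over all subsets.
    q = {S: 1.0 / len(p) for S in p}
    print("l1 distance from the DPP:", l1_distance(p, q))
```

On a three-element ground set the sketch can enumerate all $2^3$ subsets explicitly; the testing question studied in the paper is the statistical one of deciding, from samples alone, whether an unknown $q$ has this determinantal form or is $\epsilon$-far from every such distribution.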

Cite

Text

Gatmiry et al. "Testing Determinantal Point Processes." Neural Information Processing Systems, 2020.

Markdown

[Gatmiry et al. "Testing Determinantal Point Processes." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/gatmiry2020neurips-testing/)

BibTeX

@inproceedings{gatmiry2020neurips-testing,
  title     = {{Testing Determinantal Point Processes}},
  author    = {Gatmiry, Khashayar and Aliakbarpour, Maryam and Jegelka, Stefanie},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/gatmiry2020neurips-testing/}
}