Faster Algorithms for Testing Under Conditional Sampling
Abstract
There has been considerable recent interest in distribution tests whose run-time and sample requirements are sublinear in the domain size $k$. We study two of the most important tests under the conditional-sampling model, where each query specifies a subset $S$ of the domain and the response is a sample drawn from $S$ according to the underlying distribution. For identity testing, which asks whether the underlying distribution equals a specific given distribution or $\epsilon$-differs from it, we reduce the known time and sample complexities from $\tilde{\mathcal{O}}(\epsilon^{-4})$ to $\tilde{\mathcal{O}}(\epsilon^{-2})$, thereby matching the information-theoretic lower bound. For closeness testing, which asks whether two distributions underlying observed data sets are equal or different, we reduce the existing complexity from $\tilde{\mathcal{O}}(\epsilon^{-4} \log^5 k)$ to an even sub-logarithmic $\tilde{\mathcal{O}}(\epsilon^{-5} \log \log k)$, thus providing a better bound for an open problem posed at the Bertinoro Workshop on Sublinear Algorithms [Fisher, 2004].
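To make the query model concrete, below is a minimal sketch of a conditional-sampling oracle, assuming the unknown distribution is represented as a Python dict from domain elements to probabilities. The function name `conditional_sample` and the zero-mass convention are illustrative assumptions for this sketch, not definitions taken from the paper.

```python
import random

def conditional_sample(p, S, rng=random):
    """Return one sample from subset S, distributed as p conditioned on S.

    p   : dict mapping domain elements to their probabilities (assumed to sum to 1).
    S   : non-empty iterable of domain elements (the query set).
    rng : source of randomness (the random module or a random.Random instance).
    """
    S = list(S)
    weights = [p.get(x, 0.0) for x in S]
    total = sum(weights)
    if total == 0.0:
        # Convention assumed here when p assigns S zero mass: return a uniform
        # element of S. The paper's own oracle definition governs in general.
        return rng.choice(S)
    return rng.choices(S, weights=weights, k=1)[0]


# Example: querying the subset {0, 1} of a distribution over {0, 1, 2, 3}.
p = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}
print(conditional_sample(p, [0, 1]))  # returns 0 w.p. 0.25 and 1 w.p. 0.75
```

A tester in this model adaptively chooses the query sets $S$ and uses only the returned samples; the complexities quoted above count such conditional queries.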
Cite

Text

Falahatgar et al. "Faster Algorithms for Testing Under Conditional Sampling." Annual Conference on Computational Learning Theory, 2015.

Markdown

[Falahatgar et al. "Faster Algorithms for Testing Under Conditional Sampling." Annual Conference on Computational Learning Theory, 2015.](https://mlanthology.org/colt/2015/falahatgar2015colt-faster/)

BibTeX
@inproceedings{falahatgar2015colt-faster,
  title = {{Faster Algorithms for Testing Under Conditional Sampling}},
  author = {Falahatgar, Moein and Jafarpour, Ashkan and Orlitsky, Alon and Pichapati, Venkatadheeraj and Suresh, Ananda Theertha},
  booktitle = {Annual Conference on Computational Learning Theory},
  year = {2015},
  pages = {607-636},
  url = {https://mlanthology.org/colt/2015/falahatgar2015colt-faster/}
}