On Approximating Total Variation Distance
Abstract
Total variation distance (TV distance) is a fundamental notion of distance between probability distributions. In this work, we introduce and study the problem of computing the TV distance of two product distributions over the domain {0,1}^n. In particular, we establish the following results.
1. The problem of exactly computing the TV distance of two product distributions is #P-complete. This is in stark contrast with other distance measures such as KL, Chi-square, and Hellinger, which tensorize over the marginals, leading to efficient algorithms.
2. There is a fully polynomial-time deterministic approximation scheme (FPTAS) for computing the TV distance of two product distributions P and Q where Q is the uniform distribution. This result is extended to the case where Q has a constant number of distinct marginals. In contrast, we show that when P and Q are Bayes net distributions, the relative approximation of their TV distance is NP-hard.
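The contrast in result 1 can be made concrete with a small sketch. The brute-force routine below computes the TV distance of two product distributions over {0,1}^n exactly by enumerating all 2^n outcomes (exponential time, consistent with the #P-hardness of exact computation), while KL divergence tensorizes into a sum of coordinate-wise terms computable in O(n) time. The function names and interface are illustrative, not from the paper.

```python
from itertools import product
import math

def tv_distance_bruteforce(p, q):
    """Exact TV distance of two product distributions over {0,1}^n,
    given as lists p, q of Bernoulli marginal probabilities.
    Enumerates all 2^n outcomes, so runtime is exponential in n."""
    total = 0.0
    for x in product([0, 1], repeat=len(p)):
        px = math.prod(pi if xi else 1 - pi for pi, xi in zip(p, x))
        qx = math.prod(qi if xi else 1 - qi for qi, xi in zip(q, x))
        total += abs(px - qx)
    return total / 2

def kl_divergence_product(p, q):
    """KL divergence of the same two product distributions.
    KL tensorizes over the marginals, so an O(n) sum suffices."""
    def kl_bern(a, b):
        return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))
    return sum(kl_bern(a, b) for a, b in zip(p, q))
```

For example, for n = 1 the TV distance of Bernoulli(0.8) and Bernoulli(0.5) is (|0.8 − 0.5| + |0.2 − 0.5|)/2 = 0.3, while for identical marginals both quantities are 0.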
Cite
Text
Bhattacharyya et al. "On Approximating Total Variation Distance." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/387
Markdown
[Bhattacharyya et al. "On Approximating Total Variation Distance." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/bhattacharyya2023ijcai-approximating/) doi:10.24963/IJCAI.2023/387
BibTeX
@inproceedings{bhattacharyya2023ijcai-approximating,
title = {{On Approximating Total Variation Distance}},
author = {Bhattacharyya, Arnab and Gayen, Sutanu and Meel, Kuldeep S. and Myrisiotis, Dimitrios and Pavan, A. and Vinodchandran, N. V.},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {3479--3487},
doi = {10.24963/IJCAI.2023/387},
url = {https://mlanthology.org/ijcai/2023/bhattacharyya2023ijcai-approximating/}
}