Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians

Abstract

Mixtures of high-dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between the total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between two-component Gaussian mixtures with a shared covariance matrix.
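
The characteristic-function connection mentioned in the abstract rests on a standard inequality: since |e^{i t x}| ≤ 1, we have |φ_P(t) − φ_Q(t)| ≤ 2 d_TV(P, Q) for every t, so (1/2) sup_t |φ_P(t) − φ_Q(t)| is always a lower bound on the total variation distance. Below is a minimal numerical sketch of this bound for one-dimensional two-component mixtures with a shared unit variance; the parameters are illustrative and the code is not the paper's construction.

```python
import numpy as np
from scipy.stats import norm

# Two-component Gaussian mixtures in 1D with shared unit variance.
# Parameters are illustrative, not taken from the paper.
w1, mu1a, mu1b = 0.5, -1.0, 1.0   # mixture P
w2, mu2a, mu2b = 0.3, -0.5, 2.0   # mixture Q

def mix_pdf(x, w, ma, mb):
    """Density of w * N(ma, 1) + (1 - w) * N(mb, 1)."""
    return w * norm.pdf(x, ma, 1.0) + (1 - w) * norm.pdf(x, mb, 1.0)

def mix_cf(t, w, ma, mb):
    """Characteristic function of the mixture:
    E[e^{itX}] = e^{-t^2/2} (w e^{it ma} + (1 - w) e^{it mb})."""
    return np.exp(-t**2 / 2) * (w * np.exp(1j * t * ma) + (1 - w) * np.exp(1j * t * mb))

# Total variation distance by numerical integration:
# d_TV(P, Q) = (1/2) * integral of |p(x) - q(x)| dx.
xs = np.linspace(-12.0, 12.0, 240001)
dx = xs[1] - xs[0]
tv = 0.5 * np.sum(np.abs(mix_pdf(xs, w1, mu1a, mu1b) - mix_pdf(xs, w2, mu2a, mu2b))) * dx

# Characteristic-function lower bound:
# (1/2) * sup_t |phi_P(t) - phi_Q(t)| <= d_TV(P, Q).
ts = np.linspace(-20.0, 20.0, 200001)
lb = 0.5 * np.max(np.abs(mix_cf(ts, w1, mu1a, mu1b) - mix_cf(ts, w2, mu2a, mu2b)))

print(f"d_TV ~ {tv:.4f}, characteristic-function lower bound ~ {lb:.4f}")
assert lb <= tv + 1e-6  # the bound must never exceed the true distance
```

The paper's contribution is to make bounds of this flavor quantitatively tight for two-component mixtures with a shared covariance matrix; the sketch above only verifies that the characteristic-function quantity is indeed a valid lower bound on a concrete pair of mixtures.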

Cite

Text

Davies et al. "Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians." Proceedings of The 33rd International Conference on Algorithmic Learning Theory, 2022.

Markdown

[Davies et al. "Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians." Proceedings of The 33rd International Conference on Algorithmic Learning Theory, 2022.](https://mlanthology.org/alt/2022/davies2022alt-lower/)

BibTeX

@inproceedings{davies2022alt-lower,
  title     = {{Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians}},
  author    = {Davies, Sami and Mazumdar, Arya and Pal, Soumyabrata and Rashtchian, Cyrus},
  booktitle = {Proceedings of The 33rd International Conference on Algorithmic Learning Theory},
  year      = {2022},
  pages     = {319--341},
  volume    = {167},
  url       = {https://mlanthology.org/alt/2022/davies2022alt-lower/}
}