An Adaptive Test of Independence with Analytic Kernel Embeddings

Abstract

A new computationally efficient dependence measure, and an adaptive statistical test of independence, are proposed. The dependence measure is the difference between analytic embeddings of the joint distribution and the product of the marginals, evaluated at a finite set of locations (features). These features are chosen so as to maximize a lower bound on the test power, resulting in a test that is data-efficient, and that runs in linear time (with respect to the sample size n). The optimized features can be interpreted as evidence to reject the null hypothesis, indicating regions in the joint domain where the joint distribution and the product of the marginals differ most. Consistency of the independence test is established, for an appropriate choice of features. In real-world benchmarks, independence tests using the optimized features perform comparably to the state-of-the-art quadratic-time HSIC test, and outperform competing O(n) and O(n log n) tests.
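The core quantity described above — the difference between the embedding of the joint distribution and that of the product of the marginals, evaluated at a finite set of location pairs — can be estimated empirically in O(nJ) time for J locations. The sketch below is a minimal, hedged illustration of that idea with Gaussian kernels and randomly drawn locations; the paper's actual test additionally normalizes the statistic and optimizes the locations to maximize a lower bound on test power, neither of which is shown here. All function and variable names are illustrative, not from the authors' implementation.

```python
import numpy as np

def gauss_kernel(A, B, sigma):
    # Pairwise Gaussian kernel matrix between rows of A and rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def embedding_difference(X, Y, V, W, sigma_x=1.0, sigma_y=1.0):
    # For each location pair (v_j, w_j), estimate
    #   u_j = E[k(x, v_j) l(y, w_j)] - E[k(x, v_j)] E[l(y, w_j)],
    # i.e., the joint embedding minus the product of marginal embeddings
    # evaluated at that location. Under independence, each u_j is near zero.
    K = gauss_kernel(X, V, sigma_x)        # (n, J)
    L = gauss_kernel(Y, W, sigma_y)        # (n, J)
    joint = (K * L).mean(axis=0)           # empirical E[k(x,v) l(y,w)]
    prod = K.mean(axis=0) * L.mean(axis=0) # empirical E[k(x,v)] E[l(y,w)]
    return joint - prod                    # length-J witness vector

rng = np.random.default_rng(0)
n, J = 500, 3
X = rng.normal(size=(n, 1))
Y = X + 0.1 * rng.normal(size=(n, 1))      # strongly dependent toy sample
V = rng.normal(size=(J, 1))                # test locations: random here,
W = rng.normal(size=(J, 1))                # optimized in the actual method
u = embedding_difference(X, Y, V, W)
stat = n * float(u @ u)                    # unnormalized squared statistic
```

Because only n-by-J kernel evaluations are needed, the cost is linear in the sample size n, in contrast to the quadratic-time HSIC test mentioned in the abstract.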

Cite

Text

Jitkrittum et al. "An Adaptive Test of Independence with Analytic Kernel Embeddings." International Conference on Machine Learning, 2017.

Markdown

[Jitkrittum et al. "An Adaptive Test of Independence with Analytic Kernel Embeddings." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/jitkrittum2017icml-adaptive/)

BibTeX

@inproceedings{jitkrittum2017icml-adaptive,
  title     = {{An Adaptive Test of Independence with Analytic Kernel Embeddings}},
  author    = {Jitkrittum, Wittawat and Szabó, Zoltán and Gretton, Arthur},
  booktitle = {International Conference on Machine Learning},
  year      = {2017},
  pages     = {1742--1751},
  volume    = {70},
  url       = {https://mlanthology.org/icml/2017/jitkrittum2017icml-adaptive/}
}