Copula-Based Kernel Dependency Measures

Abstract

The paper presents a new copula-based method for measuring dependence between random variables. Our approach extends the Maximum Mean Discrepancy to the copula of the joint distribution. We prove that this approach has several advantageous properties. Similarly to Shannon mutual information, the proposed dependence measure is invariant to any strictly increasing transformation of the marginal variables. This is important in many applications, for example in feature selection. The estimator is consistent, robust to outliers, and uses rank statistics only. We derive upper bounds on the convergence rate and also propose independence tests. We illustrate the theoretical contributions through a series of experiments in feature selection and low-dimensional embedding of distributions.
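The core idea described in the abstract is to measure dependence as the MMD between the copula of the joint distribution and the independence (uniform) copula, with an empirical version that plugs in normalized ranks. Below is a minimal Python sketch of that idea, assuming a Gaussian kernel with an arbitrary bandwidth and a Monte Carlo approximation of the uniform copula; the function names and parameters are illustrative, and this is not the paper's exact estimator.

```python
import numpy as np

def empirical_copula_transform(X):
    """Map each column of X to normalized ranks in (0, 1) (pseudo-observations)."""
    n, d = X.shape
    U = np.empty((n, d))
    for j in range(d):
        # rank of each sample within column j, rescaled to lie strictly inside (0, 1)
        U[:, j] = (np.argsort(np.argsort(X[:, j])) + 1) / (n + 1)
    return U

def gaussian_gram(A, B, sigma=0.3):
    """Gaussian kernel Gram matrix between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def copula_mmd2(X, sigma=0.3, seed=0):
    """Biased MMD^2 between the empirical copula of X and the uniform
    (independence) copula on [0, 1]^d, approximated by uniform samples."""
    rng = np.random.default_rng(seed)
    U = empirical_copula_transform(np.asarray(X, dtype=float))
    V = rng.uniform(size=U.shape)  # samples from the independence copula
    k_uu = gaussian_gram(U, U, sigma).mean()
    k_vv = gaussian_gram(V, V, sigma).mean()
    k_uv = gaussian_gram(U, V, sigma).mean()
    return k_uu + k_vv - 2.0 * k_uv

# Example: a monotone-nonlinearly related pair scores much higher than an independent pair.
rng = np.random.default_rng(42)
x = rng.normal(size=500)
dependent = np.column_stack([x, np.exp(x) + 0.1 * rng.normal(size=500)])
independent = rng.normal(size=(500, 2))
print(copula_mmd2(dependent), copula_mmd2(independent))
```

Because the statistic is computed from ranks only, applying any strictly increasing transformation to a marginal (e.g., replacing x with exp(x)) leaves the value unchanged, which mirrors the invariance property highlighted in the abstract.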

Cite

Text

Póczos et al. "Copula-Based Kernel Dependency Measures." International Conference on Machine Learning, 2012.

Markdown

[Póczos et al. "Copula-Based Kernel Dependency Measures." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/poczos2012icml-copula/)

BibTeX

@inproceedings{poczos2012icml-copula,
  title     = {{Copula-Based Kernel Dependency Measures}},
  author    = {Póczos, Barnabás and Ghahramani, Zoubin and Schneider, Jeff G.},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  url       = {https://mlanthology.org/icml/2012/poczos2012icml-copula/}
}