Dependence, Correlation and Gaussianity in Independent Component Analysis

Abstract

Independent component analysis (ICA) is the decomposition of a random vector into linear components which are "as independent as possible." Here, "independence" should be understood in its strong statistical sense: it goes beyond (second-order) decorrelation and thus involves the non-Gaussianity of the data. The ideal measure of independence is the mutual information, which is known to be related to the entropy of the components when the search for components is restricted to uncorrelated components. This paper explores the connections between mutual information, entropy and non-Gaussianity in a larger framework, without resorting to a somewhat arbitrary decorrelation constraint. A key result is that the mutual information can be decomposed, under linear transforms, as the sum of two terms: one term measuring the correlation of the components and one measuring their non-Gaussianity.
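
A sketch of the announced decomposition, in standard ICA notation (the symbols below follow common usage and are not quoted from the paper): let $H$ denote differential entropy, $Y = (Y_1, \dots, Y_n)$ the vector of components, $Y^g$ a Gaussian vector with the same covariance as $Y$, and $J(Y) = H(Y^g) - H(Y)$ the negentropy. Starting from $I(Y) = \sum_i H(Y_i) - H(Y)$ and substituting $H(\cdot) = H(\cdot^g) - J(\cdot)$ in both the joint and marginal terms gives

$$
I(Y) \;=\; \underbrace{\sum_i H(Y_i^g) - H(Y^g)}_{C(Y):\ \text{correlation term}} \;+\; \underbrace{J(Y) - \sum_i J(Y_i)}_{\text{non-Gaussianity term}},
$$

where $C(Y)$ is the mutual information of the Gaussian vector $Y^g$, vanishing exactly when the components are uncorrelated. Since the joint negentropy $J(Y)$ is invariant under invertible linear transforms, minimizing $I(Y)$ over such transforms amounts to driving $C(Y)$ toward zero while maximizing the marginal negentropies $\sum_i J(Y_i)$.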

Cite

Text

Cardoso. "Dependence, Correlation and  Gaussianity in Independent Component Analysis." Journal of Machine Learning Research, 2003.

Markdown

[Cardoso. "Dependence, Correlation and  Gaussianity in Independent Component Analysis." Journal of Machine Learning Research, 2003.](https://mlanthology.org/jmlr/2003/cardoso2003jmlr-dependence/)

BibTeX

@article{cardoso2003jmlr-dependence,
  title     = {{Dependence, Correlation and Gaussianity in Independent Component Analysis}},
  author    = {Cardoso, Jean-François},
  journal   = {Journal of Machine Learning Research},
  year      = {2003},
  pages     = {1177--1203},
  volume    = {4},
  url       = {https://mlanthology.org/jmlr/2003/cardoso2003jmlr-dependence/}
}