Algorithms for Independent Components Analysis and Higher Order Statistics

Abstract

A latent variable generative model with finite noise is used to describe several different algorithms for Independent Components Analysis (ICA). In particular, the Fixed Point ICA algorithm is shown to be equivalent to the Expectation-Maximization algorithm for maximum likelihood under certain constraints, allowing the conditions for global convergence to be elucidated. The algorithms can also be explained by their generic behavior near a singular point where the size of the optimal generative bases vanishes. An expansion of the likelihood about this singular point indicates the role of higher order correlations in determining the features discovered by ICA. The application and convergence of these algorithms are demonstrated on a simple illustrative example.
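
A minimal sketch of a fixed-point ICA update may help make the abstract concrete. The one-unit iteration below follows the familiar FastICA form (w ← E[x g(wᵀx)] − E[g′(wᵀx)] w, followed by renormalization), which is an assumption on our part rather than the paper's exact formulation; the function name fixed_point_ica, the tanh contrast, and the whitening step are illustrative choices.

import numpy as np

def fixed_point_ica(X, n_iter=200, tol=1e-6):
    """One-unit fixed-point ICA with a tanh contrast (FastICA-style sketch).

    X: (d, n) array of observations, assumed already centered and whitened.
    Returns a unit vector w such that w @ X approximates one source.
    """
    rng = np.random.default_rng(0)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ X                                   # current projection, shape (n,)
        g, g_prime = np.tanh(y), 1.0 - np.tanh(y) ** 2
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w   # fixed-point step
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:         # converged up to a sign flip
            return w_new
        w = w_new
    return w

# Usage: recover one independent direction from a two-source mixture.
rng = np.random.default_rng(1)
S = rng.uniform(-1.0, 1.0, size=(2, 5000))          # independent non-Gaussian sources
X = np.array([[1.0, 0.5], [0.3, 1.0]]) @ S          # linear mixing
X -= X.mean(axis=1, keepdims=True)                  # center
d_vals, E = np.linalg.eigh(X @ X.T / X.shape[1])    # whiten via eigendecomposition
Xw = (E / np.sqrt(d_vals)).T @ X
w = fixed_point_ica(Xw)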

Cite

Text

Lee et al. "Algorithms for Independent Components Analysis and Higher Order Statistics." Neural Information Processing Systems, 1999.

Markdown

[Lee et al. "Algorithms for Independent Components Analysis and Higher Order Statistics." Neural Information Processing Systems, 1999.](https://mlanthology.org/neurips/1999/lee1999neurips-algorithms/)

BibTeX

@inproceedings{lee1999neurips-algorithms,
  title     = {{Algorithms for Independent Components Analysis and Higher Order Statistics}},
  author    = {Lee, Daniel D. and Rokni, Uri and Sompolinsky, Haim},
  booktitle = {Neural Information Processing Systems},
  year      = {1999},
  pages     = {491--497},
  url       = {https://mlanthology.org/neurips/1999/lee1999neurips-algorithms/}
}