A Constrained EM Algorithm for Independent Component Analysis

Abstract

We introduce a novel way of performing independent component analysis using a constrained version of the expectation-maximization (EM) algorithm. The source distributions are modeled as D one-dimensional mixtures of Gaussians. The observed data are modeled as linear mixtures of the sources with additive, isotropic noise. This generative model is fit to the data using constrained EM. The simpler "soft-switching" approach is introduced, which uses only one parameter to decide on the sub- or supergaussian nature of the sources. We explain how our approach relates to independent factor analysis.
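The generative model described in the abstract can be sketched in a few lines of NumPy: each of the D sources is drawn from its own one-dimensional mixture of Gaussians, and the observations are a linear mixture of the sources plus isotropic Gaussian noise. All dimensions, mixture parameters, and the noise level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
D, N = 3, 1000          # number of sources, number of samples
K = 2                   # Gaussian components per source mixture

# Each source is a 1-D mixture of Gaussians; these parameters are
# arbitrary illustrative choices, not taken from the paper.
weights = np.full((D, K), 1.0 / K)           # component mixing proportions
means = rng.normal(0.0, 2.0, size=(D, K))    # component means
stds = rng.uniform(0.5, 1.5, size=(D, K))    # component standard deviations

# Sample the sources s (D x N): pick a mixture component per sample,
# then draw from the corresponding Gaussian.
comp = np.array([rng.choice(K, size=N, p=weights[d]) for d in range(D)])
rows = np.arange(D)[:, None]
s = means[rows, comp] + stds[rows, comp] * rng.normal(size=(D, N))

# Observed data: linear mixture plus additive isotropic noise, x = A s + eps.
A = rng.normal(size=(D, D))                  # mixing matrix
sigma = 0.1                                  # isotropic noise std dev
x = A @ s + sigma * rng.normal(size=(D, N))
```

Fitting this model with constrained EM (the paper's contribution) would then alternate between inferring the posterior over sources and component assignments, and updating the mixture parameters, mixing matrix, and noise level under the paper's constraints.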

Cite

Text

Welling and Weber. "A Constrained EM Algorithm for Independent Component Analysis." Neural Computation, 2001. doi:10.1162/089976601300014510

Markdown

[Welling and Weber. "A Constrained EM Algorithm for Independent Component Analysis." Neural Computation, 2001.](https://mlanthology.org/neco/2001/welling2001neco-constrained/) doi:10.1162/089976601300014510

BibTeX

@article{welling2001neco-constrained,
  title     = {{A Constrained EM Algorithm for Independent Component Analysis}},
  author    = {Welling, Max and Weber, Markus},
  journal   = {Neural Computation},
  year      = {2001},
  pages     = {677--689},
  doi       = {10.1162/089976601300014510},
  volume    = {13},
  url       = {https://mlanthology.org/neco/2001/welling2001neco-constrained/}
}