Products of Gaussians and Probabilistic Minor Component Analysis

Abstract

Recently, Hinton introduced the products of experts architecture for density estimation, in which individual expert probabilities are multiplied together and renormalized. We consider products of Gaussian “pancakes,” each equally elongated in all directions except one, and prove that the maximum likelihood solution for this model gives rise to a minor component analysis solution. We also discuss the covariance structure of sums and products of Gaussian pancakes or one-factor probabilistic principal component analysis models.
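The abstract's claim can be illustrated numerically. In a product of Gaussian experts the precision matrices add, so a "pancake" expert with precision A = (1/σ₀²)I + β w wᵀ compresses variance along w while leaving other directions equally elongated; the paper's result is that at the maximum likelihood solution the w vectors align with the minor (smallest-variance) eigenvectors of the data covariance. The sketch below, with arbitrary illustrative values for σ₀² and β, checks that squashing along the minor component yields a higher likelihood than squashing along the principal component:

```python
import numpy as np

rng = np.random.default_rng(0)

# 3-D data with anisotropic covariance; the minor component is the
# eigenvector of the sample covariance with the smallest eigenvalue.
true_cov = np.diag([4.0, 2.0, 0.25])
X = rng.multivariate_normal(np.zeros(3), true_cov, size=5000)

def pancake_loglik(X, w, s0sq=4.0, b=3.0):
    """Log-likelihood of a zero-mean Gaussian "pancake" expert with
    precision A = (1/s0sq) I + b w w^T (squashed along unit vector w).
    s0sq and b are illustrative values, not the paper's parameters."""
    A = np.eye(3) / s0sq + b * np.outer(w, w)
    _, logdet = np.linalg.slogdet(A)
    quad = np.einsum('ni,ij,nj->n', X, A, X)
    return 0.5 * len(X) * (logdet - 3 * np.log(2 * np.pi)) - 0.5 * quad.sum()

S = np.cov(X, rowvar=False)
evals, evecs = np.linalg.eigh(S)            # eigenvalues in ascending order
minor, principal = evecs[:, 0], evecs[:, -1]

# Squashing along the minor direction fits the data better:
print(pancake_loglik(X, minor) > pancake_loglik(X, principal))  # True
```

With several pancake experts, the product's precision is the sum of the individual precisions, which is why the model tends toward the directions of least data variance rather than the principal components.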

Cite

Text

Williams and Agakov. "Products of Gaussians and Probabilistic Minor Component Analysis." Neural Computation, 2002. doi:10.1162/089976602753633439

Markdown

[Williams and Agakov. "Products of Gaussians and Probabilistic Minor Component Analysis." Neural Computation, 2002.](https://mlanthology.org/neco/2002/williams2002neco-products/) doi:10.1162/089976602753633439

BibTeX

@article{williams2002neco-products,
  title     = {{Products of Gaussians and Probabilistic Minor Component Analysis}},
  author    = {Williams, Christopher K. I. and Agakov, Felix V.},
  journal   = {Neural Computation},
  year      = {2002},
  pages     = {1169--1182},
  doi       = {10.1162/089976602753633439},
  volume    = {14},
  url       = {https://mlanthology.org/neco/2002/williams2002neco-products/}
}