Bayesian Unsupervised Learning of Higher Order Structure

Abstract

Multilayer architectures such as those used in Bayesian belief networks and Helmholtz machines provide a powerful framework for representing and learning higher order statistical relations among inputs. Because exact probability calculations with these models are often intractable, there is much interest in finding approximate algorithms. We present an algorithm that efficiently discovers higher order structure using EM and Gibbs sampling. The model can be interpreted as a stochastic recurrent network in which ambiguity in lower-level states is resolved through feedback from higher levels. We demonstrate the performance of the algorithm on benchmark problems.
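To give a flavor of the inference step the abstract mentions, here is a minimal sketch of Gibbs sampling over the hidden units of a small two-layer sigmoid belief network. This is an illustrative toy, not the paper's actual model: the network size, the uniform hidden prior, and the random weights `W`, `b` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical two-layer sigmoid belief network: binary hidden units h
# generate binary visible units v with p(v_i = 1 | h) = sigmoid(W[i] @ h + b[i]).
n_hidden, n_visible = 4, 6
W = rng.normal(0.0, 1.0, size=(n_visible, n_hidden))
b = rng.normal(0.0, 1.0, size=n_visible)
prior_logit = np.zeros(n_hidden)  # uniform prior: p(h_j = 1) = 0.5

def log_lik(v, h):
    """Log-likelihood of the visible vector v under hidden state h."""
    p_v = sigmoid(W @ h + b)
    return np.sum(v * np.log(p_v) + (1.0 - v) * np.log(1.0 - p_v))

def gibbs_sample_hidden(v, n_sweeps=50):
    """Draw an approximate sample from p(h | v) by sweeping each unit in turn."""
    h = rng.integers(0, 2, size=n_hidden).astype(float)
    for _ in range(n_sweeps):
        for j in range(n_hidden):
            # Log-odds of h_j = 1 vs h_j = 0 given the other hidden units and v.
            h[j] = 1.0
            ll_on = log_lik(v, h)
            h[j] = 0.0
            ll_off = log_lik(v, h)
            logit = prior_logit[j] + ll_on - ll_off
            h[j] = float(rng.random() < sigmoid(logit))
    return h

v = rng.integers(0, 2, size=n_visible).astype(float)
h_sample = gibbs_sample_hidden(v)
```

In a full learning loop, samples like `h_sample` would stand in for the intractable posterior expectations inside EM-style weight updates; the feedback from higher levels described in the abstract corresponds to the conditioning on the other units in each Gibbs step.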

Cite

Text

Lewicki and Sejnowski. "Bayesian Unsupervised Learning of Higher Order Structure." Neural Information Processing Systems, 1996.

Markdown

[Lewicki and Sejnowski. "Bayesian Unsupervised Learning of Higher Order Structure." Neural Information Processing Systems, 1996.](https://mlanthology.org/neurips/1996/lewicki1996neurips-bayesian/)

BibTeX

@inproceedings{lewicki1996neurips-bayesian,
  title     = {{Bayesian Unsupervised Learning of Higher Order Structure}},
  author    = {Lewicki, Michael S. and Sejnowski, Terrence J.},
  booktitle = {Neural Information Processing Systems},
  year      = {1996},
  pages     = {529--535},
  url       = {https://mlanthology.org/neurips/1996/lewicki1996neurips-bayesian/}
}