The Nonnegative Boltzmann Machine
Abstract
The nonnegative Boltzmann machine (NNBM) is a recurrent neural network model that can describe multimodal nonnegative data. Application of maximum likelihood estimation to this model gives a learning rule that is analogous to that of the binary Boltzmann machine. We examine the utility of the mean field approximation for the NNBM, and describe how Monte Carlo sampling techniques can be used to learn its parameters. Reflective slice sampling is particularly well-suited for this distribution, and can be implemented efficiently to sample from it. We illustrate learning of the NNBM on a translationally invariant distribution, as well as on a generative model for images of human faces.
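As a rough illustration of the sampling step mentioned in the abstract, below is a minimal Python sketch of coordinate-wise slice sampling (a simpler variant than the reflective slice sampler used in the paper) for an NNBM-style density p(v) proportional to exp(-E(v)) with quadratic energy E(v) = 0.5 v'Av - b'v restricted to v >= 0. The specific A, b, step width, and iteration counts are illustrative assumptions, not values from the paper.

import numpy as np

def energy(v, A, b):
    # Quadratic NNBM-style energy E(v) = 0.5 v^T A v - b^T v, defined on v >= 0.
    return 0.5 * v @ A @ v - b @ v

def slice_sweep(v, A, b, width=1.0, rng=None):
    # One Gibbs sweep: univariate slice-sampling update of each coordinate,
    # restricted to the nonnegative half-line (stepping out + shrinkage).
    rng = np.random.default_rng() if rng is None else rng
    v = v.copy()
    for i in range(len(v)):
        def logp(x):
            w = v.copy()
            w[i] = x
            return -energy(w, A, b)
        # Auxiliary height defines the slice {x >= 0 : logp(x) > y}.
        y = logp(v[i]) + np.log(rng.uniform())
        # Randomly place an interval of the given width around v[i],
        # then step out; the left edge is truncated at the constraint x >= 0.
        left = max(0.0, v[i] - width * rng.uniform())
        right = left + width
        while left > 0.0 and logp(left) > y:
            left = max(0.0, left - width)
        while logp(right) > y:
            right += width
        # Shrink the interval until a point inside the slice is drawn.
        while True:
            x = rng.uniform(left, right)
            if logp(x) > y:
                v[i] = x
                break
            if x < v[i]:
                left = x
            else:
                right = x
    return v

# Illustrative 2-D example: A is copositive but not positive definite, so the
# density has two modes on the nonnegative orthant (near (1, 0) and (0, 1)).
A = np.array([[1.0, 2.0], [2.0, 1.0]])
b = np.array([1.0, 1.0])
v = np.ones(2)
samples = []
for _ in range(2000):
    v = slice_sweep(v, A, b)
    samples.append(v)

The reflective slice sampler described in the paper instead follows straight-line trajectories that reflect off the slice boundary, which can mix better when the coordinates are strongly coupled; the sweep above is only meant to show the flavor of slice sampling under the nonnegativity constraint.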
Cite
Text
Downs et al. "The Nonnegative Boltzmann Machine." Neural Information Processing Systems, 1999.
Markdown
[Downs et al. "The Nonnegative Boltzmann Machine." Neural Information Processing Systems, 1999.](https://mlanthology.org/neurips/1999/downs1999neurips-nonnegative/)
BibTeX
@inproceedings{downs1999neurips-nonnegative,
title = {{The Nonnegative Boltzmann Machine}},
author = {Downs, Oliver B. and MacKay, David J. C. and Lee, Daniel D.},
booktitle = {Neural Information Processing Systems},
year = {1999},
pages = {428--434},
url = {https://mlanthology.org/neurips/1999/downs1999neurips-nonnegative/}
}