Boltzmann Machine Learning with the Latent Maximum Entropy Principle
Abstract
We present a new statistical learning paradigm for Boltzmann machines based on a new inference principle we have proposed: the latent maximum entropy principle (LME). LME is different both from Jaynes' maximum entropy principle and from standard maximum likelihood estimation. We demonstrate the LME principle by deriving new algorithms for Boltzmann machine parameter estimation, and show how a robust and rapidly convergent new variant of the EM algorithm can be developed. Our experiments show that estimation based on LME generally yields better results than maximum likelihood estimation when inferring models from small amounts of data.
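For orientation, here is a minimal sketch of the constrained program behind LME, using assumed notation (x for observed variables, y for latent variables, \tilde{p}(x) for the empirical distribution, and features f_i); the exact formulation and regularity conditions in the paper may differ in detail:

\max_{p} \; H(p) = -\sum_{x,y} p(x,y)\,\log p(x,y)

subject to

\sum_{x,y} p(x,y)\, f_i(x,y) \;=\; \sum_{x} \tilde{p}(x) \sum_{y} p(y \mid x)\, f_i(x,y), \qquad i = 1, \dots, N.

Unlike Jaynes' maximum entropy principle, where the right-hand sides are fixed empirical feature expectations, here they depend on the model itself through p(y | x), so the constraints are nonlinear in p; this self-referential structure is what connects LME to EM-style iterations for parameter estimation. For a Boltzmann machine, the features would be products of unit values s_j s_k (with x the visible and y the hidden units), so the log-linear feasible solutions take the familiar form p_\lambda(s) \propto \exp\big(\sum_{j<k} \lambda_{jk} s_j s_k\big).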
Cite
Text
Wang et al. "Boltzmann Machine Learning with the Latent Maximum Entropy Principle." Conference on Uncertainty in Artificial Intelligence, 2003.
Markdown
[Wang et al. "Boltzmann Machine Learning with the Latent Maximum Entropy Principle." Conference on Uncertainty in Artificial Intelligence, 2003.](https://mlanthology.org/uai/2003/wang2003uai-boltzmann/)
BibTeX
@inproceedings{wang2003uai-boltzmann,
title = {{Boltzmann Machine Learning with the Latent Maximum Entropy Principle}},
author = {Wang, Shaojun and Schuurmans, Dale and Peng, Fuchun and Zhao, Yunxin},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2003},
pages = {567--574},
url = {https://mlanthology.org/uai/2003/wang2003uai-boltzmann/}
}