Sample Space Truncation on Boltzmann Machines
Abstract
We present a lightweight variant of Boltzmann machines based on sample space truncation, called a truncated Boltzmann machine (TBM), which has not been investigated before but arises naturally from the log-linear model viewpoint. TBMs can alleviate the massive computational cost of exact training of Boltzmann machines, which requires exponential-time evaluation of the expected values and the partition function of the model distribution. To analyze the learnability of TBMs, we theoretically provide a bias-variance decomposition of the log-linear model using the dually flat structure of statistical manifolds.
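The exponential cost mentioned above comes from summing over all 2^n binary states when evaluating the partition function. The sketch below is purely illustrative and is not the paper's TBM formulation: the energy function, parameter values, and the particular truncated support (states with at most two active units) are all assumptions chosen only to contrast a full sum with a truncated one.

```python
# Hypothetical sketch, NOT the paper's method: contrasts the full 2^n
# partition-function sum of a fully visible Boltzmann machine with a sum
# over an assumed truncated sample space.
import itertools
import math

def energy(x, theta, w):
    # Unnormalized log-probability: sum_i theta_i x_i + sum_{i<j} w_ij x_i x_j
    n = len(x)
    e = sum(theta[i] * x[i] for i in range(n))
    e += sum(w[i][j] * x[i] * x[j] for i in range(n) for j in range(i + 1, n))
    return e

def partition_function(states, theta, w):
    # Z = sum over the given sample space of exp(energy)
    return sum(math.exp(energy(x, theta, w)) for x in states)

n = 10
theta = [0.1] * n                      # arbitrary illustrative parameters
w = [[0.05] * n for _ in range(n)]

full = list(itertools.product([0, 1], repeat=n))   # all 2^10 = 1024 states
# Assumed truncation: keep only states with at most 2 active units.
truncated = [x for x in full if sum(x) <= 2]       # 1 + 10 + 45 = 56 states

Z_full = partition_function(full, theta, w)
Z_trunc = partition_function(truncated, theta, w)
print(len(full), len(truncated))   # prints "1024 56"
```

The point of the contrast: the truncated sum touches 56 states instead of 1024, and the gap widens exponentially with n, which is the cost reduction the abstract refers to; how the truncated model is then normalized and trained is what the paper itself develops.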
Cite
Text
Sugiyama et al. "Sample Space Truncation on Boltzmann Machines." NeurIPS 2020 Workshops: DL-IG, 2020.

Markdown

[Sugiyama et al. "Sample Space Truncation on Boltzmann Machines." NeurIPS 2020 Workshops: DL-IG, 2020.](https://mlanthology.org/neuripsw/2020/sugiyama2020neuripsw-sample/)

BibTeX
@inproceedings{sugiyama2020neuripsw-sample,
title = {{Sample Space Truncation on Boltzmann Machines}},
author = {Sugiyama, Mahito and Tsuda, Koji and Nakahara, Hiroyuki},
booktitle = {NeurIPS 2020 Workshops: DL-IG},
year = {2020},
url = {https://mlanthology.org/neuripsw/2020/sugiyama2020neuripsw-sample/}
}