Bias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions

Abstract

Hierarchical probabilistic models can use a large number of parameters to create a model with high representational power. However, it is well known that increasing the number of parameters also increases the complexity of the model, which leads to a bias-variance trade-off. Although this is a classical problem, the bias-variance trade-off between hidden layers and higher-order interactions has not been well studied. In our study, we propose an efficient inference algorithm for the log-linear formulation of the higher-order Boltzmann machine using a combination of Gibbs sampling and annealed importance sampling. We then perform a bias-variance decomposition to study the differences between hidden layers and higher-order interactions. Our results show that hidden layers and higher-order interactions yield comparable error of a similar order of magnitude, and that higher-order interactions produce less variance for smaller sample sizes.
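The bias-variance decomposition central to the abstract can be illustrated with a minimal Monte Carlo sketch (not the paper's code): repeatedly draw training samples, compute a deliberately biased estimator on each, and verify that the mean squared error splits exactly into squared bias plus variance. The shrinkage estimator and all constants below are hypothetical choices for illustration only.

```python
# Illustrative sketch of a bias-variance decomposition (assumed setup,
# not the paper's experiment): MSE = bias^2 + variance.
import random

random.seed(0)

TRUE_MEAN = 1.0      # ground-truth parameter
N_TRIALS = 2000      # number of independent "training sets"
SAMPLE_SIZE = 10     # points per training set

def shrunken_mean(xs, lam=0.5):
    """A deliberately biased estimator: shrink the sample mean toward 0."""
    return lam * sum(xs) / len(xs)

estimates = []
for _ in range(N_TRIALS):
    xs = [random.gauss(TRUE_MEAN, 1.0) for _ in range(SAMPLE_SIZE)]
    estimates.append(shrunken_mean(xs))

mean_est = sum(estimates) / N_TRIALS
bias_sq = (mean_est - TRUE_MEAN) ** 2
variance = sum((e - mean_est) ** 2 for e in estimates) / N_TRIALS
mse = sum((e - TRUE_MEAN) ** 2 for e in estimates) / N_TRIALS

# The decomposition holds exactly for these Monte Carlo averages.
assert abs(mse - (bias_sq + variance)) < 1e-9
```

A higher-capacity model corresponds to a less-shrunken (lower-bias, higher-variance) estimator; the paper performs this decomposition empirically to compare hidden layers against higher-order interactions.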

Cite

Text

Luo and Sugiyama. "Bias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33014488

Markdown

[Luo and Sugiyama. "Bias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/luo2019aaai-bias/) doi:10.1609/AAAI.V33I01.33014488

BibTeX

@inproceedings{luo2019aaai-bias,
  title     = {{Bias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions}},
  author    = {Luo, Simon and Sugiyama, Mahito},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {4488--4495},
  doi       = {10.1609/AAAI.V33I01.33014488},
  url       = {https://mlanthology.org/aaai/2019/luo2019aaai-bias/}
}