Thin Junction Trees

Abstract

We present an algorithm that induces a class of models with thin junction trees—models characterized by an upper bound on the size of the maximal cliques of their triangulated graphs. By ensuring that the junction tree remains thin, inference in our models stays tractable throughout the learning process. This permits an efficient implementation of an iterative scaling algorithm for parameter estimation and ensures that inference with the final model is also efficient. We illustrate the approach with applications in handwritten digit recognition and DNA splice site detection.
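To make the clique-size constraint concrete, here is a minimal sketch (not the authors' code; the function name and example cliques are illustrative) of the "thinness" check: a junction tree is thin when every clique in the triangulated graph contains at most k variables, which bounds the cost of exact inference.

```python
# Illustrative sketch: a junction tree is "thin" if every clique
# has at most k variables. Exact inference costs O(d**k) per clique
# for variables with d states, so bounding k keeps inference tractable.

def is_thin(cliques, k):
    """Return True if every clique (a set of variables) has size <= k."""
    return all(len(clique) <= k for clique in cliques)

def inference_cost(cliques, d):
    """Rough cost of message passing: sum of clique table sizes (d states per variable)."""
    return sum(d ** len(clique) for clique in cliques)

# Cliques of a triangulated chain A-B-C-D: {A,B}, {B,C}, {C,D}.
cliques = [{"A", "B"}, {"B", "C"}, {"C", "D"}]
print(is_thin(cliques, 2))          # thin with bound k=2
print(is_thin(cliques, 1))          # not thin with bound k=1
print(inference_cost(cliques, 10))  # 3 cliques of size 2 with 10 states each
```

The key point of the paper is that this bound is enforced throughout learning, so every intermediate model admits tractable inference, not just the final one.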

Cite

Text

Bach and Jordan. "Thin Junction Trees." Neural Information Processing Systems, 2001.

Markdown

[Bach and Jordan. "Thin Junction Trees." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/bach2001neurips-thin/)

BibTeX

@inproceedings{bach2001neurips-thin,
  title     = {{Thin Junction Trees}},
  author    = {Bach, Francis R. and Jordan, Michael I.},
  booktitle = {Neural Information Processing Systems},
  year      = {2001},
  pages     = {569--576},
  url       = {https://mlanthology.org/neurips/2001/bach2001neurips-thin/}
}