An EM Algorithm on BDDs with Order Encoding for Logic-Based Probabilistic Models

Abstract

Logic-based probabilistic models (LBPMs) enable us to handle a wide range of real-world problems thanks to the expressive power of logic. However, most LBPMs impose restrictions in order to achieve efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without such restrictions. The time complexity of our algorithm is proportional to the size of the BDD; in the case of hidden Markov models (HMMs), it matches the complexity of the EM algorithm specialized for HMMs. As an example of eliminating the restrictions of existing approaches, we apply our algorithm to failure diagnosis in a logic circuit involving stochastic error gates.
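To make the abstract's key device concrete: order encoding represents a multi-valued random variable with a monotone vector of Boolean variables, which BDD-based inference can then manipulate directly. Below is a minimal Python sketch of the standard order encoding b_i ⇔ (V ≥ i); the function names are illustrative and not taken from the paper, and this is not the authors' exact formulation.

```python
def order_encode(value, num_values):
    """Order-encode an integer value in {0, ..., num_values - 1} as
    Boolean variables b_i <=> (value >= i), for i = 1 .. num_values - 1.
    Valid codes are exactly the monotone bit vectors of the form 1...10...0."""
    return [value >= i for i in range(1, num_values)]

def order_decode(bits):
    """Recover the integer value: it equals the number of True bits.
    First check the ordering constraint b_{i+1} -> b_i."""
    assert all(bits[i] or not bits[i + 1] for i in range(len(bits) - 1)), \
        "not a valid order encoding (bits must be monotone non-increasing)"
    return sum(bits)

# A 4-valued variable needs 3 Boolean variables:
codes = [order_encode(v, 4) for v in range(4)]
# e.g. value 2 is encoded as [True, True, False]
```

The ordering constraint (each later bit implies the previous one) rules out invalid bit patterns, which is what lets a BDD over these Boolean variables represent propositional formulas over the original multi-valued variable compactly.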

Cite

Text

Ishihata et al. "An EM Algorithm on BDDs with Order Encoding for Logic-Based Probabilistic Models." Proceedings of 2nd Asian Conference on Machine Learning, 2010.

Markdown

[Ishihata et al. "An EM Algorithm on BDDs with Order Encoding for Logic-Based Probabilistic Models." Proceedings of 2nd Asian Conference on Machine Learning, 2010.](https://mlanthology.org/acml/2010/ishihata2010acml-em/)

BibTeX

@inproceedings{ishihata2010acml-em,
  title     = {{An EM Algorithm on BDDs with Order Encoding for Logic-Based Probabilistic Models}},
  author    = {Ishihata, Masakazu and Kameya, Yoshitaka and Sato, Taisuke and Minato, Shin-ichi},
  booktitle = {Proceedings of 2nd Asian Conference on Machine Learning},
  year      = {2010},
  pages     = {161--176},
  volume    = {13},
  url       = {https://mlanthology.org/acml/2010/ishihata2010acml-em/}
}