Age-Layered Expectation Maximization for Parameter Learning in Bayesian Networks

Abstract

The expectation maximization (EM) algorithm is a popular method for parameter estimation in models with hidden variables. However, it has several non-trivial limitations, a significant one being variation in the solutions found, due to convergence to local optima. Several techniques have been proposed to mitigate this problem, for example initializing EM from multiple random starting points and selecting the highest-likelihood result among all runs. In this work, we (a) show that this method can be computationally very expensive for difficult Bayesian networks, and (b) propose in response an age-layered EM approach (ALEM) that efficiently discards less promising runs well before convergence. Our experiments show a significant reduction in the number of iterations, typically two- to four-fold, with minimal or no reduction in solution quality, indicating the potential for ALEM to streamline parameter estimation in Bayesian networks.
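
To make the pruning idea concrete, below is a minimal, hypothetical Python sketch (not the paper's implementation): multiple EM restarts are grouped by age (the number of EM iterations completed), fresh random restarts keep entering the youngest layer, and at layer boundaries any run whose log-likelihood falls below the median of its own age layer is discarded early. A 1-D Gaussian mixture stands in for a Bayesian network with a hidden variable, and the layer size, median pruning rule, and iteration budget are illustrative assumptions rather than the paper's exact algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a 1-D two-component Gaussian mixture stands in for a Bayesian
# network with a hidden variable (the component indicator).
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])


def em_step(x, w, mu, sigma):
    """One EM iteration for a 1-D Gaussian mixture; returns updated parameters and the log-likelihood."""
    # E-step: per-point, per-component densities and responsibilities.
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    ll = float(np.log(dens.sum(axis=1)).sum())
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    mu_new = (resp * x[:, None]).sum(axis=0) / nk
    sigma_new = np.maximum(np.sqrt((resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk), 1e-3)
    return (nk / len(x), mu_new, sigma_new), ll


def new_run(k=2):
    """A fresh random restart: uniform weights, random means, unit standard deviations."""
    return {"params": (np.full(k, 1.0 / k), rng.normal(0.0, 3.0, k), np.full(k, 1.0)),
            "ll": -np.inf, "age": 0}


layer_size, max_age, runs_per_layer, budget = 10, 60, 4, 1500
runs = [new_run() for _ in range(runs_per_layer)]
finished, total_iters, started = [], 0, runs_per_layer

while runs:
    # Advance every active run by one EM iteration; a run's age is its iteration count.
    for run in runs:
        run["params"], run["ll"] = em_step(data, *run["params"])
        run["age"] += 1
        total_iters += 1
    # At each layer boundary, discard runs below the median log-likelihood of
    # their own age layer (assumed pruning rule, not the paper's exact criterion).
    drop = set()
    for age in {r["age"] for r in runs if r["age"] % layer_size == 0}:
        layer = [r for r in runs if r["age"] == age]
        cutoff = np.median([r["ll"] for r in layer])
        drop |= {id(r) for r in layer if r["ll"] < cutoff}
    runs = [r for r in runs if id(r) not in drop]
    # Retire runs that reach the maximum age; seed a fresh layer of random
    # restarts once the youngest surviving layer has aged past one layer.
    finished += [r for r in runs if r["age"] >= max_age]
    runs = [r for r in runs if r["age"] < max_age]
    if total_iters < budget and all(r["age"] >= layer_size for r in runs):
        runs += [new_run() for _ in range(runs_per_layer)]
        started += runs_per_layer

best = max(finished, key=lambda r: r["ll"])
print(f"started {started} restarts, best log-likelihood {best['ll']:.1f}, "
      f"{total_iters} EM iterations vs. {started * max_age} if all ran to max age")

The final print contrasts the EM iterations actually spent against what the same number of restarts would cost if each ran for the full maximum age, giving a rough sense of the savings that early pruning provides.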

Cite

Text

Saluja et al. "Age-Layered Expectation Maximization for Parameter Learning in Bayesian Networks." Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, 2012.

Markdown

[Saluja et al. "Age-Layered Expectation Maximization for Parameter Learning in Bayesian Networks." Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, 2012.](https://mlanthology.org/aistats/2012/saluja2012aistats-agelayered/)

BibTeX

@inproceedings{saluja2012aistats-agelayered,
  title     = {{Age-Layered Expectation Maximization for Parameter Learning in Bayesian Networks}},
  author    = {Saluja, Avneesh and Sundararajan, Priya Krishnan and Mengshoel, Ole J.},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  year      = {2012},
  pages     = {984--992},
  volume    = {22},
  url       = {https://mlanthology.org/aistats/2012/saluja2012aistats-agelayered/}
}