Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm
Abstract
We present the CEM (Conditional Expectation Maximization) algorithm as an extension of the EM (Expectation Maximization) algorithm to conditional density estimation under missing data. A bounding and maximization process is given to specifically optimize conditional likelihood instead of the usual joint likelihood. We apply the method to conditioned mixture models and use bounding techniques to derive the model's update rules. Monotonic convergence, computational efficiency and regression results superior to EM are demonstrated.
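The abstract contrasts optimizing the conditional likelihood p(y|x) with the joint likelihood p(x,y) that standard EM maximizes. As a minimal illustrative sketch (not the paper's actual CEM update rules), the snippet below evaluates the conditional log-likelihood of a conditioned Gaussian mixture by subtracting the marginal log-likelihood over x from the joint log-likelihood; the component parameterization and variable names are assumptions for illustration only.

```python
import math

def gauss(v, mu, var):
    """Univariate Gaussian density N(v; mu, var)."""
    return math.exp(-(v - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def cond_log_lik(data, components):
    """Conditional log-likelihood sum_i log p(y_i | x_i) for a mixture
    of axis-aligned Gaussians over (x, y).

    data: list of (x, y) pairs.
    components: list of (weight, mu_x, mu_y, var) tuples, weights summing to 1.
    """
    ll = 0.0
    for x, y in data:
        # Joint density p(x, y) under the mixture.
        joint = sum(w * gauss(x, mx, v) * gauss(y, my, v)
                    for w, mx, my, v in components)
        # Marginal density p(x): integrate out y analytically.
        marg = sum(w * gauss(x, mx, v)
                   for w, mx, my, v in components)
        # log p(y | x) = log p(x, y) - log p(x).
        ll += math.log(joint) - math.log(marg)
    return ll
```

CEM's point is that the bound-and-maximize iterations target this conditional objective directly, rather than improving the joint term and hoping the conditional term follows.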
Cite
Text
Jebara and Pentland. "Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm." Neural Information Processing Systems, 1998.
Markdown
[Jebara and Pentland. "Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm." Neural Information Processing Systems, 1998.](https://mlanthology.org/neurips/1998/jebara1998neurips-maximum/)
BibTeX
@inproceedings{jebara1998neurips-maximum,
title = {{Maximum Conditional Likelihood via Bound Maximization and the CEM Algorithm}},
author = {Jebara, Tony and Pentland, Alex},
booktitle = {Neural Information Processing Systems},
year = {1998},
pages = {494-500},
url = {https://mlanthology.org/neurips/1998/jebara1998neurips-maximum/}
}