Naive Bayes Models for Probability Estimation
Abstract
Naive Bayes models have been widely used for clustering and classification. However, they are seldom used for general probabilistic learning and inference (i.e., for estimating and computing arbitrary joint, conditional and marginal distributions). In this paper we show that, for a wide range of benchmark datasets, naive Bayes models learned using EM have accuracy and learning time comparable to Bayesian networks with context-specific independence. Most significantly, naive Bayes inference is orders of magnitude faster than Bayesian network inference using Gibbs sampling and belief propagation. This makes naive Bayes models a very attractive alternative to Bayesian networks for general probability estimation, particularly in large or real-time domains.
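The abstract's central claim is that a naive Bayes model (a single latent class variable with conditionally independent observed variables) supports fast answers to arbitrary joint, marginal, and conditional queries. The sketch below, with made-up toy parameters (not from the paper), illustrates why: marginalizing an unobserved variable just means dropping its factor, since each conditional distribution sums to 1 given the class, so any query costs time linear in the evidence size times the number of mixture components.

```python
import numpy as np

# Hypothetical toy naive Bayes mixture: one latent class C with two values,
# three binary observed variables X0, X1, X2. Parameters are illustrative only.
prior = np.array([0.6, 0.4])                  # P(C = c)
# cpts[i][c, v] = P(X_i = v | C = c)
cpts = [
    np.array([[0.9, 0.1], [0.2, 0.8]]),
    np.array([[0.7, 0.3], [0.5, 0.5]]),
    np.array([[0.4, 0.6], [0.1, 0.9]]),
]

def prob(evidence):
    """P(evidence), where evidence maps variable index -> observed value.
    Unobserved variables marginalize out for free: their factors sum to 1
    given C, so we simply omit them from the product."""
    p = prior.copy()
    for i, v in evidence.items():
        p = p * cpts[i][:, v]          # multiply in one factor per evidence var
    return p.sum()                     # sum over the latent class

joint = prob({0: 1, 1: 0, 2: 1})               # full joint P(X0=1, X1=0, X2=1)
marginal = prob({1: 0})                        # marginal P(X1=0)
conditional = prob({0: 1, 1: 0}) / prob({1: 0})  # P(X0=1 | X1=0)
```

Contrast this with a general Bayesian network, where answering the same queries typically requires approximate inference such as Gibbs sampling or belief propagation; this is the speed advantage the abstract refers to.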
Cite
Text
Lowd and Domingos. "Naive Bayes Models for Probability Estimation." International Conference on Machine Learning, 2005. doi:10.1145/1102351.1102418

Markdown
[Lowd and Domingos. "Naive Bayes Models for Probability Estimation." International Conference on Machine Learning, 2005.](https://mlanthology.org/icml/2005/lowd2005icml-naive/) doi:10.1145/1102351.1102418

BibTeX
@inproceedings{lowd2005icml-naive,
title = {{Naive Bayes Models for Probability Estimation}},
author = {Lowd, Daniel and Domingos, Pedro M.},
booktitle = {International Conference on Machine Learning},
year = {2005},
pages = {529--536},
doi = {10.1145/1102351.1102418},
url = {https://mlanthology.org/icml/2005/lowd2005icml-naive/}
}