Asymptotic Model Selection for Directed Networks with Hidden Variables
Abstract
We extend the Bayesian Information Criterion (BIC), an asymptotic approximation for the marginal likelihood, to Bayesian networks with hidden variables. This approximation can be used to select models given large samples of data. The standard BIC as well as our extension punishes the complexity of a model according to the dimension of its parameters. We argue that the dimension of a Bayesian network with hidden variables is the rank of the Jacobian matrix of the transformation between the parameters of the network and the parameters of the observable variables. We compute the dimensions of several networks including the naive Bayes model with a hidden root node.
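The rank computation described in the abstract can be illustrated numerically. The sketch below (not from the paper; model size and tolerances are my own choices) builds a naive Bayes model with a binary hidden root H and three binary observables, maps the 7 network parameters to the joint distribution over the 8 observable outcomes, and estimates the rank of the Jacobian of that map by finite differences:

```python
import numpy as np

def joint_probs(theta, n=3):
    """Map network parameters to the joint distribution over 2^n observable outcomes.

    theta = [p(H=1), p(X1=1|H=0), p(X1=1|H=1), ..., p(Xn=1|H=0), p(Xn=1|H=1)]
    """
    pH1 = theta[0]
    probs = np.zeros(2 ** n)
    for idx in range(2 ** n):
        bits = [(idx >> i) & 1 for i in range(n)]  # outcome of X1..Xn
        for h, ph in ((0, 1.0 - pH1), (1, pH1)):   # marginalize out hidden H
            p = ph
            for i, b in enumerate(bits):
                q = theta[1 + 2 * i + h]           # p(Xi=1 | H=h)
                p *= q if b == 1 else 1.0 - q
            probs[idx] += p
    return probs

def jacobian_rank(n=3, seed=0, eps=1e-6):
    """Numerically estimate the rank of the parameter-to-distribution Jacobian
    at a random interior point of the parameter space."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.2, 0.8, size=1 + 2 * n)
    base = joint_probs(theta, n)
    J = np.zeros((2 ** n, theta.size))
    for j in range(theta.size):
        tp = theta.copy()
        tp[j] += eps
        J[:, j] = (joint_probs(tp, n) - base) / eps  # forward difference
    # tolerance absorbs finite-difference noise (~eps)
    return np.linalg.matrix_rank(J, tol=1e-4)

print(jacobian_rank())
```

For this model the rank equals the full parameter count of 7 (the 8 joint probabilities carry only 7 free dimensions since they sum to 1), so no BIC penalty reduction applies here; the interesting cases in the paper arise when the rank drops below the raw parameter count.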
Cite
Text
Geiger et al. "Asymptotic Model Selection for Directed Networks with Hidden Variables." Conference on Uncertainty in Artificial Intelligence, 1996. doi:10.1007/978-94-011-5014-9_16
Markdown
[Geiger et al. "Asymptotic Model Selection for Directed Networks with Hidden Variables." Conference on Uncertainty in Artificial Intelligence, 1996.](https://mlanthology.org/uai/1996/geiger1996uai-asymptotic/) doi:10.1007/978-94-011-5014-9_16
BibTeX
@inproceedings{geiger1996uai-asymptotic,
title = {{Asymptotic Model Selection for Directed Networks with Hidden Variables}},
author = {Geiger, Dan and Heckerman, David and Meek, Christopher},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {1996},
pages = {283-290},
doi = {10.1007/978-94-011-5014-9_16},
url = {https://mlanthology.org/uai/1996/geiger1996uai-asymptotic/}
}