Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
Abstract
Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies. First, we reveal the core idea of FIC that allows generalization for a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover their previously unknown relationship. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.
Cite
Text

Hayashi et al. "Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood." International Conference on Machine Learning, 2015.

Markdown

[Hayashi et al. "Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood." International Conference on Machine Learning, 2015.](https://mlanthology.org/icml/2015/hayashi2015icml-rebuilding/)

BibTeX
@inproceedings{hayashi2015icml-rebuilding,
title = {{Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood}},
author = {Hayashi, Kohei and Maeda, Shin-ichi and Fujimaki, Ryohei},
booktitle = {International Conference on Machine Learning},
year = {2015},
pages = {1358--1366},
volume = {37},
url = {https://mlanthology.org/icml/2015/hayashi2015icml-rebuilding/}
}