Efficient Online Inference for Bayesian Nonparametric Relational Models
Abstract
Stochastic block models characterize observed network relationships via latent community memberships. In large social networks, we expect entities to participate in multiple communities, and the number of communities to grow with the network size. We introduce a new model for these phenomena, the hierarchical Dirichlet process relational model, which allows nodes to have mixed membership in an unbounded set of communities. To allow scalable learning, we derive an online stochastic variational inference algorithm. Focusing on assortative models of undirected networks, we also propose an efficient structured mean field variational bound, and online methods for automatically pruning unused communities. Compared to state-of-the-art online learning methods for parametric relational models, we show significantly improved perplexity and link prediction accuracy for sparse networks with tens of thousands of nodes. We also showcase an analysis of LittleSis, a large network of who-knows-who at the heights of business and government.
Cite
Text
Kim et al. "Efficient Online Inference for Bayesian Nonparametric Relational Models." Neural Information Processing Systems, 2013.
Markdown
[Kim et al. "Efficient Online Inference for Bayesian Nonparametric Relational Models." Neural Information Processing Systems, 2013.](https://mlanthology.org/neurips/2013/kim2013neurips-efficient/)
BibTeX
@inproceedings{kim2013neurips-efficient,
title = {{Efficient Online Inference for Bayesian Nonparametric Relational Models}},
author = {Kim, Dae Il and Gopalan, Prem and Blei, David and Sudderth, Erik},
booktitle = {Neural Information Processing Systems},
year = {2013},
pages = {962--970},
url = {https://mlanthology.org/neurips/2013/kim2013neurips-efficient/}
}