Discriminative Nonparametric Latent Feature Relational Models with Data Augmentation

Abstract

We present a discriminative nonparametric latent feature relational model (LFRM) for link prediction that automatically infers the dimensionality of the latent features. Under the generic RegBayes (regularized Bayesian inference) framework, we naturally incorporate the prediction loss into the probabilistic inference of a Bayesian model; set distinct regularization parameters for different types of links to handle the imbalance issue in real networks; and unify the analysis of both the smooth logistic log-loss and the piecewise linear hinge loss. For the nonconjugate posterior inference, we present a simple Gibbs sampler via data augmentation, without the restrictive assumptions made in variational methods. We further develop an approximate sampler using stochastic gradient Langevin dynamics to handle large networks with hundreds of thousands of entities and millions of links, orders of magnitude larger than what existing LFRM models can process. Extensive studies on various real networks show promising performance.
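To make the scalable-sampling idea concrete, the sketch below shows a generic stochastic gradient Langevin dynamics (SGLD) update for a simplified latent feature link model with a fixed number of features K and a logistic link likelihood. This is only an illustration of the SGLD mechanism, not the authors' nonparametric model or their data-augmentation Gibbs sampler; all names (`Z`, `W`, `sigma2`, `sgld_step`) and the fixed-K assumption are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgld_step(Z, W, links, step, sigma2=1.0, n_total=None, rng=None):
    """One SGLD update of entity features Z (N x K) and link weights W (K x K)
    on a minibatch of observed links [(i, j, y), ...] with y in {0, 1}.
    Illustrative sketch with a fixed K; the paper's model is nonparametric."""
    rng = rng or np.random.default_rng()
    n_total = n_total or len(links)
    scale = n_total / len(links)          # rescale minibatch gradient to the full data

    grad_Z = -Z / sigma2                  # gradient of a Gaussian log-prior (assumed)
    grad_W = -W / sigma2
    for i, j, y in links:
        s = Z[i] @ W @ Z[j]               # link score for entity pair (i, j)
        r = y - sigmoid(s)                # derivative of the logistic log-likelihood w.r.t. s
        grad_Z[i] += scale * r * (W @ Z[j])
        grad_Z[j] += scale * r * (W.T @ Z[i])
        grad_W    += scale * r * np.outer(Z[i], Z[j])

    # Langevin update: half-step along the stochastic gradient plus Gaussian noise
    Z += 0.5 * step * grad_Z + rng.normal(scale=np.sqrt(step), size=Z.shape)
    W += 0.5 * step * grad_W + rng.normal(scale=np.sqrt(step), size=W.shape)
    return Z, W
```

In this sketch the injected noise variance equals the step size, which is the standard SGLD recipe; shrinking the step size over iterations trades off mixing speed against discretization error when sampling from the posterior.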

Cite

Text

Chen et al. "Discriminative Nonparametric Latent Feature Relational Models with Data Augmentation." AAAI Conference on Artificial Intelligence, 2016. doi:10.1609/AAAI.V30I1.10162

Markdown

[Chen et al. "Discriminative Nonparametric Latent Feature Relational Models with Data Augmentation." AAAI Conference on Artificial Intelligence, 2016.](https://mlanthology.org/aaai/2016/chen2016aaai-discriminative/) doi:10.1609/AAAI.V30I1.10162

BibTeX

@inproceedings{chen2016aaai-discriminative,
  title     = {{Discriminative Nonparametric Latent Feature Relational Models with Data Augmentation}},
  author    = {Chen, Bei and Chen, Ning and Zhu, Jun and Song, Jiaming and Zhang, Bo},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {1153--1159},
  doi       = {10.1609/AAAI.V30I1.10162},
  url       = {https://mlanthology.org/aaai/2016/chen2016aaai-discriminative/}
}