Latent Dependency Forest Models

Abstract

Probabilistic modeling is one of the foundations of modern machine learning and artificial intelligence. In this paper, we propose a novel type of probabilistic model named the latent dependency forest model (LDFM). An LDFM models the dependencies between random variables with a forest structure that can change dynamically based on the variable values; it is therefore capable of modeling context-specific independence. We parameterize an LDFM using a first-order non-projective dependency grammar. Learning LDFMs from data can hence be formulated purely as a parameter learning problem, and the difficult problem of model structure learning is circumvented. Our experimental results show that LDFMs are competitive with existing probabilistic models.
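The key computational idea behind first-order non-projective dependency models is that the sum of weights over all dependency structures has a closed form via the directed Matrix-Tree theorem. The sketch below is an illustration of that general technique, not the authors' implementation: it assumes a dummy root node 0 and a hypothetical weight matrix `W` where `W[h, m]` is the weight of the arc h → m, and computes the partition function over all dependency forests on the real nodes (equivalently, spanning arborescences rooted at node 0) as a single determinant.

```python
import numpy as np

def forest_partition_function(W: np.ndarray) -> float:
    """Sum of arc-weight products over all dependency forests on nodes 1..n.

    W is an (n+1) x (n+1) matrix of non-negative arc weights with a
    dummy root at index 0; W[h, m] weights the arc h -> m. A forest
    over the n real nodes corresponds to a spanning arborescence
    rooted at node 0, so by the directed Matrix-Tree theorem the
    total weight is the determinant of the weighted in-degree
    Laplacian with the root's row and column deleted.
    """
    n1 = W.shape[0]
    W = W.copy()
    np.fill_diagonal(W, 0.0)                 # no self-loops
    L = -W                                   # off-diagonal: -w(h, m)
    L[np.diag_indices(n1)] = W.sum(axis=0)   # diagonal: weighted in-degree
    return float(np.linalg.det(L[1:, 1:]))   # delete root row and column

# Toy usage: 3 variables with arbitrary positive arc weights.
rng = np.random.default_rng(0)
W = rng.uniform(0.1, 1.0, size=(4, 4))
print(f"partition function over all forests: {forest_partition_function(W):.4f}")
```

In the LDFM setting the arc weights additionally depend on the values of the variables, which is what lets the forest structure change per instance; this sketch only shows the structural summation that makes inference over all forests tractable.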

Cite

Text

Chu et al. "Latent Dependency Forest Models." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.11047

Markdown

[Chu et al. "Latent Dependency Forest Models." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/chu2017aaai-latent/) doi:10.1609/AAAI.V31I1.11047

BibTeX

@inproceedings{chu2017aaai-latent,
  title     = {{Latent Dependency Forest Models}},
  author    = {Chu, Shanbo and Jiang, Yong and Tu, Kewei},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {3733--3739},
  doi       = {10.1609/AAAI.V31I1.11047},
  url       = {https://mlanthology.org/aaai/2017/chu2017aaai-latent/}
}