Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs
Abstract
Counting the number of true instances of a clause is arguably a major bottleneck in relational probabilistic inference and learning. We approximate counts in two steps: (1) transform the fully grounded relational model into a large hypergraph, and partially instantiated clauses into hypergraph motifs; (2) since the expected counts of the motifs are provably the clause counts, approximate them using summary statistics (in/outdegrees, edge counts, etc.). Our experimental results demonstrate the efficiency of these approximations, which apply to many complex statistical relational models and can be significantly faster than the state of the art, for both inference and learning, without sacrificing effectiveness.
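As a small illustration of step (2), degree statistics alone can recover the count of a simple motif without enumerating its groundings. The sketch below (not the paper's hypergraph construction; names are illustrative) counts directed two-edge path motifs u→v→w exactly as the sum over nodes of indegree times outdegree:

```python
# Illustrative sketch: counting a two-edge "path" motif in a directed
# graph from degree summary statistics, in the spirit of the paper's
# summary-statistic approximations. Not the paper's actual algorithm.
from collections import defaultdict

def count_two_paths(edges):
    """Count directed 2-paths u -> v -> w as sum_v indeg(v) * outdeg(v)."""
    indeg = defaultdict(int)
    outdeg = defaultdict(int)
    for u, v in edges:
        outdeg[u] += 1
        indeg[v] += 1
    # Each 2-path through v pairs one incoming edge with one outgoing edge.
    return sum(indeg[v] * outdeg[v] for v in set(indeg) | set(outdeg))

edges = [("a", "b"), ("b", "c"), ("b", "d"), ("c", "d")]
print(count_two_paths(edges))  # a->b->c, a->b->d, b->c->d -> 3
```

For this motif the degree-based count is exact; for richer motifs over hypergraphs, the paper's point is that analogous statistics yield fast, provably unbiased approximations.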
Cite
Text
Das et al. "Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33017816
Markdown
[Das et al. "Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/das2019aaai-fast/) doi:10.1609/AAAI.V33I01.33017816
BibTeX
@inproceedings{das2019aaai-fast,
title = {{Fast Relational Probabilistic Inference and Learning: Approximate Counting via Hypergraphs}},
author = {Das, Mayukh and Dhami, Devendra Singh and Kunapuli, Gautam and Kersting, Kristian and Natarajan, Sriraam},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {7816--7824},
doi = {10.1609/AAAI.V33I01.33017816},
url = {https://mlanthology.org/aaai/2019/das2019aaai-fast/}
}