Non-Parametric Domain Approximation for Scalable Gibbs Sampling in MLNs
Abstract
MLNs utilize relational structures that are ubiquitous in real-world situations to represent large probabilistic graphical models compactly. However, as is now well-known, inference complexity is one of the main bottlenecks in MLNs. Recently, several approaches have been proposed that exploit approximate symmetries in the MLN to reduce inference complexity. These approaches approximate large domains containing many objects with much smaller domains of meta-objects (or cluster centers), so that inference is considerably faster and more scalable. However, a drawback in most of these approaches is that it is typically very hard to tune the parameters (e.g., the number of clusters) such that inference is both efficient and accurate. Here, we propose a novel non-parametric approach that trades off solution quality against efficiency to automatically learn the optimal domain approximation. Further, we show how to perform Gibbs sampling effectively in a domain-approximated MLN by adapting the sampler according to the approximation. Our results on several benchmarks show that our approach is scalable, accurate, and converges faster than existing methods.
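To make the idea concrete, here is a minimal, hypothetical sketch (not the paper's actual algorithm) of the two ingredients the abstract describes: collapsing a domain of objects into meta-objects by grouping objects with identical evidence signatures, and adapting a Gibbs sampler so that each meta-atom's weight is scaled by the number of original objects it stands in for. The toy model, a single soft unary rule `w : P(x)`, and all function names are illustrative assumptions.

```python
import math
import random
from collections import defaultdict

def cluster_domain(evidence):
    """Collapse a domain into meta-objects (illustrative, not the paper's
    non-parametric method): objects sharing the same evidence signature are
    merged, and each meta-object remembers how many objects it represents.

    evidence: dict mapping object -> hashable evidence signature.
    Returns (meta_objects, sizes)."""
    groups = defaultdict(list)
    for obj, sig in evidence.items():
        groups[sig].append(obj)
    meta = [objs[0] for objs in groups.values()]          # one representative per cluster
    sizes = {objs[0]: len(objs) for objs in groups.values()}
    return meta, sizes

def gibbs_sample(meta, sizes, w, n_iters=5000, seed=0):
    """Gibbs sampling over the reduced model for a toy MLN with one soft
    unary rule `w : P(x)`. The sampler is adapted to the approximation by
    scaling each meta-atom's weight by its cluster size: flipping the
    meta-atom flips all the tied original atoms at once."""
    rng = random.Random(seed)
    state = {m: rng.random() < 0.5 for m in meta}
    counts = {m: 0 for m in meta}
    for _ in range(n_iters):
        for m in meta:
            # Conditional P(P(m)=True | rest) is a sigmoid of the scaled weight.
            p_true = 1.0 / (1.0 + math.exp(-w * sizes[m]))
            state[m] = rng.random() < p_true
            counts[m] += state[m]
    return {m: counts[m] / n_iters for m in meta}
```

For example, with three objects where two share an evidence signature, the domain shrinks to two meta-objects, and the size-2 cluster's marginal is pushed toward `sigmoid(2w)` rather than `sigmoid(w)` because its meta-atom speaks for two tied groundings.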
Cite
Text
Venugopal et al. "Non-Parametric Domain Approximation for Scalable Gibbs Sampling in MLNs." Conference on Uncertainty in Artificial Intelligence, 2016.
Markdown
[Venugopal et al. "Non-Parametric Domain Approximation for Scalable Gibbs Sampling in MLNs." Conference on Uncertainty in Artificial Intelligence, 2016.](https://mlanthology.org/uai/2016/venugopal2016uai-non/)
BibTeX
@inproceedings{venugopal2016uai-non,
title = {{Non-Parametric Domain Approximation for Scalable Gibbs Sampling in MLNs}},
author = {Venugopal, Deepak and Sarkhel, Somdeb and Cherry, Kyle},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2016},
url = {https://mlanthology.org/uai/2016/venugopal2016uai-non/}
}