Lifted Message Passing for Hybrid Probabilistic Inference

Abstract

Lifted inference algorithms for first-order logic models, e.g., Markov logic networks (MLNs), have been of significant interest in recent years. Lifted inference methods exploit model symmetries in order to reduce the size of the model and, consequently, the computational cost of inference. In this work, we consider the problem of lifted inference in MLNs with continuous or both discrete and continuous groundings. Existing work on lifting with continuous groundings has mostly been limited to special classes of models, e.g., Gaussian models, for which variable elimination or message-passing updates can be computed exactly. Here, we develop approximate lifted inference schemes based on particle sampling. We demonstrate empirically that our approximate lifting schemes perform comparably to existing state-of-the-art methods for Gaussian MLNs, while having the flexibility to be applied to models with arbitrary potential functions.
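The core idea the abstract describes, that symmetric (exchangeable) groundings can share a single message computation, and that continuous potentials can be handled with particle samples, can be sketched in a few lines. The following is a minimal generic illustration, not the paper's exact algorithm: the pairwise potential, the number of groundings `K`, and the particle set are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pairwise potential between a query variable x and each of
# K exchangeable continuous groundings y_k (illustration only).
def potential(x, y):
    return np.exp(-0.5 * (x - y) ** 2)

K = 100                                      # number of symmetric groundings
particles = rng.normal(0.0, 1.0, size=500)   # particle approximation of each y_k's belief

def lifted_log_message(x):
    # All K groundings are symmetric, so one particle-averaged message
    # suffices; lifting replaces K identical computations with a single
    # one raised to the K-th power (multiplied by K in log space).
    m = potential(x, particles).mean()
    return K * np.log(m)

def ground_log_message(x):
    # The ground (unlifted) computation repeats the same work K times.
    return sum(np.log(potential(x, particles).mean()) for _ in range(K))

# The lifted and ground messages agree, at a fraction of the cost.
assert np.isclose(lifted_log_message(0.3), ground_log_message(0.3))
```

The particle average stands in for an integral that, outside special cases such as Gaussian models, has no closed form; this is what lets the scheme handle arbitrary potential functions.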

Cite

Text

Chen et al. "Lifted Message Passing for Hybrid Probabilistic Inference." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/790

Markdown

[Chen et al. "Lifted Message Passing for Hybrid Probabilistic Inference." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/chen2019ijcai-lifted/) doi:10.24963/IJCAI.2019/790

BibTeX

@inproceedings{chen2019ijcai-lifted,
  title     = {{Lifted Message Passing for Hybrid Probabilistic Inference}},
  author    = {Chen, Yuqiao and Ruozzi, Nicholas and Natarajan, Sriraam},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2019},
  pages     = {5701--5707},
  doi       = {10.24963/IJCAI.2019/790},
  url       = {https://mlanthology.org/ijcai/2019/chen2019ijcai-lifted/}
}