DeepStochLog: Neural Stochastic Logic Programming

Abstract

Recent advances in neural-symbolic learning, such as DeepProbLog, extend probabilistic logic programs with neural predicates. Like graphical models, these probabilistic logic programs define a probability distribution over possible worlds, for which inference is computationally hard. We propose DeepStochLog, an alternative neural-symbolic framework based on stochastic definite clause grammars, a kind of stochastic logic program. More specifically, we introduce neural grammar rules into stochastic definite clause grammars to create a framework that can be trained end-to-end. We show that inference and learning in neural stochastic logic programming scale much better than for neural probabilistic logic programs. Furthermore, the experimental evaluation shows that DeepStochLog achieves state-of-the-art results on challenging neural-symbolic learning tasks.
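To give a feel for the semantics the abstract describes, the sketch below models a toy stochastic grammar in which each rule carries a probability and a derivation's probability is the product of the probabilities of the rules it uses; the "neural" rule's distribution comes from a softmax over network scores. This is a minimal illustration of the idea, not DeepStochLog's actual API — names like `neural_digit_scores` are invented for the example.

```python
import math

def softmax(scores):
    # Normalize raw scores into a probability distribution.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Stand-in for a neural network scoring an input image as digits 0-9
# (in DeepStochLog this role is played by a neural grammar rule).
neural_digit_scores = [0.1] * 10
neural_digit_scores[3] = 5.0  # the net is confident the image shows a 3
p_digit = softmax(neural_digit_scores)

# Fixed probabilities for two symbolic rules of a toy expression grammar:
#   0.5 :: e(N) --> digit(N).
#   0.5 :: e(N) --> e(N1), [+], e(N2).
p_base, p_plus = 0.5, 0.5

# Probability of deriving the image as the single-digit expression "3":
# one use of the base rule times the neural rule's probability for 3.
p_derivation = p_base * p_digit[3]
```

Training end-to-end then amounts to backpropagating through such products of rule probabilities into the network that produces the neural rule's distribution.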

Cite

Text

Winters et al. "DeepStochLog: Neural Stochastic Logic Programming." AAAI Conference on Artificial Intelligence, 2022. doi:10.1609/AAAI.V36I9.21248

Markdown

[Winters et al. "DeepStochLog: Neural Stochastic Logic Programming." AAAI Conference on Artificial Intelligence, 2022.](https://mlanthology.org/aaai/2022/winters2022aaai-deepstochlog/) doi:10.1609/AAAI.V36I9.21248

BibTeX

@inproceedings{winters2022aaai-deepstochlog,
  title     = {{DeepStochLog: Neural Stochastic Logic Programming}},
  author    = {Winters, Thomas and Marra, Giuseppe and Manhaeve, Robin and De Raedt, Luc},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {10090--10100},
  doi       = {10.1609/AAAI.V36I9.21248},
  url       = {https://mlanthology.org/aaai/2022/winters2022aaai-deepstochlog/}
}