Scaling the Weight Parameters in Markov Logic Networks and Relational Logistic Regression Models

Abstract

Extrapolation with domain size has received considerable attention recently, both in its own right and as part of the broader issue of scaling inference and learning to large domains. We consider Markov logic networks and relational logistic regression as two fundamental representation formalisms in statistical relational artificial intelligence that use weighted formulas in their specification. However, Markov logic networks are based on undirected graphs, while relational logistic regression is based on directed acyclic graphs. We show that when scaling the weight parameters with the domain size, the asymptotic behaviour of a relational logistic regression model is transparently controlled by the parameters, and we supply an algorithm to compute asymptotic probabilities. We show using two examples that this is not true for Markov logic networks. We also discuss, using several examples mainly drawn from the literature, how the application context can help the user decide when such scaling is appropriate and when the raw, unscaled parameters might be preferable.
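The contrast between scaled and unscaled weights can be illustrated with a toy relational logistic regression model. This is a minimal sketch, not code from the paper: the parameter values `w0`, `w1` and the sigmoid-of-weighted-count form are illustrative assumptions, chosen to show how dividing the formula weight by the domain size makes the probability depend on the proportion of true parents rather than their absolute count.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def rlr_prob(w0, w1, m, n, scaled=False):
    """Toy RLR conditional probability P(q | m of n parent atoms true).

    Unscaled: sigmoid(w0 + w1 * m) -- the weighted count grows with n.
    Scaled:   sigmoid(w0 + (w1 / n) * m) -- depends only on the
              proportion m / n, so it converges as n grows.
    """
    weight = w1 / n if scaled else w1
    return sigmoid(w0 + weight * m)

# Illustrative parameters; half of the parent atoms are true.
w0, w1, p = -1.0, 2.0, 0.5
domains = (10, 100, 1000)

# Unscaled: the probability saturates towards 1 as the domain grows.
unscaled = [rlr_prob(w0, w1, int(p * n), n) for n in domains]

# Scaled: converges to sigmoid(w0 + w1 * p) = sigmoid(0) = 0.5 at every n.
scaled = [rlr_prob(w0, w1, int(p * n), n, scaled=True) for n in domains]
```

Here the scaled model's asymptotic probability is read off directly from the parameters (sigmoid of w0 + w1 times the limiting proportion), which is the kind of transparent parameter control the abstract attributes to relational logistic regression.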

Cite

Text

Weitkämper. "Scaling the Weight Parameters in Markov Logic Networks and Relational Logistic Regression Models." Machine Learning, 2025. doi:10.1007/s10994-024-06635-7

Markdown

[Weitkämper. "Scaling the Weight Parameters in Markov Logic Networks and Relational Logistic Regression Models." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/weitkamper2025mlj-scaling/) doi:10.1007/s10994-024-06635-7

BibTeX

@article{weitkamper2025mlj-scaling,
  title     = {{Scaling the Weight Parameters in Markov Logic Networks and Relational Logistic Regression Models}},
  author    = {Weitkämper, Felix Q.},
  journal   = {Machine Learning},
  year      = {2025},
  pages     = {85},
  doi       = {10.1007/s10994-024-06635-7},
  volume    = {114},
  url       = {https://mlanthology.org/mlj/2025/weitkamper2025mlj-scaling/}
}