First Order Regression

Abstract

We present a new approach, called First Order Regression (FOR), to handling numerical information in Inductive Logic Programming (ILP). FOR is a combination of ILP and numerical regression. First-order logic descriptions are induced to carve out those subspaces that are amenable to numerical regression among real-valued variables. The program FORS is an implementation of this idea, where numerical regression is focused on a distinguished continuous argument of the target predicate. We show that this can be viewed as a generalisation of the usual ILP problem. Applications of FORS on several real-world data sets are described: the prediction of mutagenicity of chemicals, the modelling of liquid dynamics in a surge tank, predicting the roughness in steel grinding, finite element mesh design, and operator's skill reconstruction in electric discharge machining. A comparison of FORS' performance with previous results in these domains indicates that FORS is an effective tool for ILP applications that involve numerical data.
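The core idea — induce logical conditions that carve the example space into subspaces, then fit a numerical model to a distinguished continuous argument within each subspace — can be illustrated with a toy sketch. This is not the FORS system; the "clauses" below are hypothetical Python predicates standing in for induced first-order descriptions, and the per-subspace model is ordinary 1-D least squares:

```python
# Toy illustration of the First Order Regression idea (NOT the FORS
# implementation): logical conditions partition the examples, and a
# numerical regression model is fitted to the distinguished continuous
# argument within each covered subspace.

def fit_line(points):
    """Ordinary least squares for y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical examples: (material, x, y), where y plays the role of
# the distinguished continuous argument of the target predicate.
examples = [
    ("steel", 1.0, 2.1), ("steel", 2.0, 4.0), ("steel", 3.0, 6.2),
    ("brass", 1.0, 0.9), ("brass", 2.0, 1.1), ("brass", 3.0, 0.8),
]

# Stand-ins for induced clauses: one logical condition per subspace.
clauses = {
    "material(steel)": lambda e: e[0] == "steel",
    "material(brass)": lambda e: e[0] == "brass",
}

# Fit a separate regression line inside each subspace.
model = {head: fit_line([(e[1], e[2]) for e in examples if covers(e)])
         for head, covers in clauses.items()}

def predict(example):
    """Route the example to the clause that covers it, then regress."""
    for head, covers in clauses.items():
        if covers(example):
            a, b = model[head]
            return a * example[1] + b
```

The point of the sketch is the division of labour: the symbolic conditions decide *which* numerical model applies, and regression supplies the real-valued prediction within that region — the combination of ILP and numerical regression described in the abstract.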

Cite

Text

Karalic and Bratko. "First Order Regression." Machine Learning, 1997. doi:10.1023/A:1007365207130

Markdown

[Karalic and Bratko. "First Order Regression." Machine Learning, 1997.](https://mlanthology.org/mlj/1997/karalic1997mlj-first/) doi:10.1023/A:1007365207130

BibTeX

@article{karalic1997mlj-first,
  title     = {{First Order Regression}},
  author    = {Karalic, Aram and Bratko, Ivan},
  journal   = {Machine Learning},
  year      = {1997},
  pages     = {147--176},
  doi       = {10.1023/A:1007365207130},
  volume    = {26},
  url       = {https://mlanthology.org/mlj/1997/karalic1997mlj-first/}
}