Compositional Noisy-Logical Learning

Abstract

We describe a new method for learning the conditional probability distribution of a binary-valued variable from labelled training examples. Our proposed Compositional Noisy-Logical Learning (CNLL) approach learns a noisy-logical distribution in a compositional manner. CNLL is an alternative to the well-known AdaBoost algorithm, performing coordinate descent on a different error measure. We describe two CNLL algorithms and test their performance against AdaBoost on two types of problem: (i) noisy-logical data (such as noisy exclusive-or), and (ii) four standard datasets from the UCI repository. Our results show that CNLL outperforms AdaBoost while using significantly fewer weak classifiers, thereby giving a more transparent classifier suitable for knowledge extraction.
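For concreteness, here is a minimal sketch of the kind of noisy exclusive-or data the abstract mentions: binary inputs whose XOR label is flipped with some small probability. The flip probability and function name below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def noisy_xor_data(n, noise=0.1, seed=0):
    """Generate n labelled examples of noisy exclusive-or.

    Each example has two binary inputs; the label is their XOR,
    flipped independently with probability `noise` (an illustrative
    parameter, not a value from the paper).
    """
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=(n, 2))                 # binary inputs
    y = np.logical_xor(x[:, 0], x[:, 1]).astype(int)    # clean XOR labels
    flip = rng.random(n) < noise                        # which labels to flip
    y = np.where(flip, 1 - y, y)
    return x, y

x, y = noisy_xor_data(1000, noise=0.1)
```

Data of this form is hard for a single linear weak classifier, which is why XOR-like problems are a natural stress test for boosting-style methods.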

Cite

Text

Yuille and Zheng. "Compositional Noisy-Logical Learning." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553528

Markdown

[Yuille and Zheng. "Compositional Noisy-Logical Learning." International Conference on Machine Learning, 2009.](https://mlanthology.org/icml/2009/yuille2009icml-compositional/) doi:10.1145/1553374.1553528

BibTeX

@inproceedings{yuille2009icml-compositional,
  title     = {{Compositional Noisy-Logical Learning}},
  author    = {Yuille, Alan L. and Zheng, Songfeng},
  booktitle = {International Conference on Machine Learning},
  year      = {2009},
  pages     = {1209--1216},
  doi       = {10.1145/1553374.1553528},
  url       = {https://mlanthology.org/icml/2009/yuille2009icml-compositional/}
}