Canonical Horn Representations and Query Learning
Abstract
We describe an alternative construction of an existing canonical representation for definite Horn theories, the Guigues-Duquenne basis (or GD basis), which minimizes a natural notion of implicational size. We extend the canonical representation to general Horn CNF by providing a reduction from definite to general Horn CNF. We show how this representation relates to two topics in query learning theory: first, we show that a well-known algorithm by Angluin, Frazier and Pitt that learns Horn CNF always outputs the GD basis, independently of the counterexamples it receives; second, we build strong polynomial certificates for Horn CNF directly from the GD basis.
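As illustrative background (not code from the paper), the basic operation underlying both the GD basis and the Angluin-Frazier-Pitt algorithm is forward-chaining closure of an attribute set under a definite Horn theory. A minimal sketch, representing each implication as a pair of antecedent and consequent attribute sets:

```python
def closure(attrs, implications):
    """Return the closure of `attrs` under a set of definite Horn
    implications, each given as (antecedent, consequent) frozensets.

    Repeatedly fires any implication whose antecedent is already
    contained in the current set, until a fixed point is reached.
    """
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in implications:
            if antecedent <= closed and not consequent <= closed:
                closed |= consequent
                changed = True
    return frozenset(closed)

# Hypothetical example theory: {a -> b, bc -> d}
theory = [
    (frozenset("a"), frozenset("b")),
    (frozenset("bc"), frozenset("d")),
]
print(sorted(closure({"a", "c"}, theory)))  # ['a', 'b', 'c', 'd']
```

The closures of this operator are exactly the models of the theory, and the GD basis is characterized in terms of certain minimal "pseudo-closed" sets of this operator.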
Cite
Text
Arias and Balcázar. "Canonical Horn Representations and Query Learning." International Conference on Algorithmic Learning Theory, 2009. doi:10.1007/978-3-642-04414-4_16
Markdown
[Arias and Balcázar. "Canonical Horn Representations and Query Learning." International Conference on Algorithmic Learning Theory, 2009.](https://mlanthology.org/alt/2009/arias2009alt-canonical/) doi:10.1007/978-3-642-04414-4_16
BibTeX
@inproceedings{arias2009alt-canonical,
title = {{Canonical Horn Representations and Query Learning}},
author = {Arias, Marta and Balcázar, José L.},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2009},
pages = {156--170},
doi = {10.1007/978-3-642-04414-4_16},
url = {https://mlanthology.org/alt/2009/arias2009alt-canonical/}
}