Kernel Machines and Boolean Functions
Abstract
We give results about the learnability and required complexity of logical formulae to solve classification problems. These results are obtained by linking propositional logic with kernel machines. In particular, we show that decision trees and disjunctive normal forms (DNF) can be represented with the help of a special kernel, linking regularized risk to separation margin. Subsequently we derive a number of lower bounds on the required complexity of logic formulae using properties of algorithms for the generation of linear estimators, such as perceptron and maximal perceptron learning.
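The link between Boolean formulae and kernel machines sketched in the abstract can be illustrated concretely. The sketch below is hypothetical and not taken from the paper: it uses the all-subsets kernel k(x, z) = Π_i (1 + x_i z_i) on {0,1}^n, whose feature space is spanned by all monomials (conjunctions of input bits), so a linear separator in that space can express DNF-style concepts. A kernel perceptron trained with this kernel then learns XOR, which is not linearly separable in the input space:

```python
import numpy as np

def subset_kernel(x, z):
    # All-subsets kernel on {0,1}^n: k(x,z) = prod_i (1 + x_i * z_i).
    # Expanding the product yields one term per monomial (conjunction),
    # so this kernel implicitly embeds inputs into the space of all
    # Boolean conjunctions of the coordinates.
    return np.prod(1 + x * z)

def kernel_perceptron(X, y, epochs=100):
    # Standard kernel perceptron: alpha[i] counts the mistakes made
    # on example i; the decision function is a kernel expansion.
    n = len(X)
    alpha = np.zeros(n)
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            s = sum(alpha[j] * y[j] * subset_kernel(X[j], X[i])
                    for j in range(n))
            if y[i] * s <= 0:
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # converged: all points correctly classified
            break
    return alpha

# XOR labels over {0,1}^2: separable in the monomial feature space
# (e.g. f(x) = -1 + 2*x1 + 2*x2 - 4*x1*x2 realizes XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])
alpha = kernel_perceptron(X, y)
pred = [int(np.sign(sum(alpha[j] * y[j] * subset_kernel(X[j], x)
                        for j in range(len(X)))))
        for x in X]
print(pred)  # matches y: [-1, 1, 1, -1]
```

Because the target is separable with positive margin in the induced feature space, the perceptron convergence theorem guarantees the training loop above terminates with zero mistakes.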
Cite
Text
Kowalczyk et al. "Kernel Machines and Boolean Functions." Neural Information Processing Systems, 2001.
Markdown
[Kowalczyk et al. "Kernel Machines and Boolean Functions." Neural Information Processing Systems, 2001.](https://mlanthology.org/neurips/2001/kowalczyk2001neurips-kernel/)
BibTeX
@inproceedings{kowalczyk2001neurips-kernel,
title = {{Kernel Machines and Boolean Functions}},
author = {Kowalczyk, Adam and Smola, Alex J. and Williamson, Robert C.},
booktitle = {Neural Information Processing Systems},
year = {2001},
pages = {439--446},
url = {https://mlanthology.org/neurips/2001/kowalczyk2001neurips-kernel/}
}