Complexity Theoretic Limitations on Learning DNF's

Abstract

Using the recently developed framework of Daniely, Linial and Shalev-Shwartz, we show that under a natural assumption on the complexity of refuting random K-SAT formulas, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various other learning problems, including learning intersections of logarithmically many halfspaces, agnostically learning conjunctions, and virtually all distribution-free learning problems that were previously shown hard (under various complexity assumptions).
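For context, a standard background definition (not quoted from the paper itself): a DNF formula over Boolean variables x_1, ..., x_n is a disjunction of terms, each term a conjunction of literals. A minimal sketch in LaTeX:

% Standard definition of a DNF formula (background, not from the paper):
% a disjunction of m terms, each a conjunction of literals over x_1,...,x_n.
\[
  \phi(x_1,\dots,x_n) \;=\; \bigvee_{i=1}^{m} \bigwedge_{j \in S_i} \ell_{i,j},
  \qquad \ell_{i,j} \in \{x_k,\ \neg x_k : 1 \le k \le n\},
\]

where m is the number of terms. The hardness results discussed in the abstract concern learning this class in the distribution-free PAC setting.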

Cite

Text

Daniely and Shalev-Shwartz. "Complexity Theoretic Limitations on Learning DNF's." Annual Conference on Computational Learning Theory, 2016.

Markdown

[Daniely and Shalev-Shwartz. "Complexity Theoretic Limitations on Learning DNF's." Annual Conference on Computational Learning Theory, 2016.](https://mlanthology.org/colt/2016/daniely2016colt-complexity/)

BibTeX

@inproceedings{daniely2016colt-complexity,
  title     = {{Complexity Theoretic Limitations on Learning DNF's}},
  author    = {Daniely, Amit and Shalev-Shwartz, Shai},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2016},
  pages     = {815--830},
  url       = {https://mlanthology.org/colt/2016/daniely2016colt-complexity/}
}