Mansour's Conjecture Is True for Random DNF Formulas
Abstract
In 1994, Y. Mansour conjectured that for every DNF formula on n variables with t terms there exists a polynomial p with $t^{O(\log(1/\varepsilon))}$ non-zero coefficients such that $\mathbb{E}_{x \in \{0,1\}^n}[(p(x)-f(x))^2] \leq \varepsilon$. We make the first progress on this conjecture and show that it is true for several natural subclasses of DNF formulas, including randomly chosen DNF formulas and read-k DNF formulas for constant k. Our result yields the first polynomial-time query algorithm for agnostically learning these subclasses of DNF formulas with respect to the uniform distribution on $\{0,1\}^n$ (for any constant error parameter). Applying recent work on sandwiching polynomials, our results imply that a $t^{-O(\log(1/\varepsilon))}$-biased distribution fools the above subclasses of DNF formulas. This gives pseudorandom generators for these subclasses with shorter seed length than all previous work.
Cite
Text
Klivans et al. "Mansour's Conjecture Is True for Random DNF Formulas." Annual Conference on Computational Learning Theory, 2010.

Markdown
[Klivans et al. "Mansour's Conjecture Is True for Random DNF Formulas." Annual Conference on Computational Learning Theory, 2010.](https://mlanthology.org/colt/2010/klivans2010colt-mansour/)

BibTeX
@inproceedings{klivans2010colt-mansour,
title = {{Mansour's Conjecture Is True for Random DNF Formulas}},
author = {Klivans, Adam R. and Lee, Homin K. and Wan, Andrew},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2010},
pages = {368-380},
url = {https://mlanthology.org/colt/2010/klivans2010colt-mansour/}
}