On Noise-Tolerant Learning of Sparse Parities and Related Problems

Abstract

We consider the problem of learning sparse parities in the presence of noise. For learning parities on $r$ out of $n$ variables, we give an algorithm that runs in time $\mathrm{poly}\left(\log \frac{1}{\delta}, \frac{1}{1-2\eta}\right) n^{\left(1+(2\eta)^2+ o(1)\right)r/2}$ and uses only $\frac{r \log(n/\delta)\, \omega(1)}{(1 - 2\eta)^2}$ samples in the random noise setting under the uniform distribution, where $\eta$ is the noise rate and $\delta$ is the confidence parameter. By previously known reductions, the algorithm also handles adversarial noise and generalizes to arbitrary distributions. Although efficient algorithms for learning sparse parities in the presence of noise would have major implications for learning other hypothesis classes, our work is the first to give a bound better than the brute-force $O(n^r)$. As a consequence, we obtain the first nontrivial bound for learning $r$-juntas in the presence of noise, as well as a small improvement in the complexity of learning DNF under the uniform distribution.
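For context, the brute-force $O(n^r)$ baseline that the paper improves upon can be sketched as follows: enumerate all $\binom{n}{r}$ candidate parities and return the one with the lowest empirical error on the noisy samples, which with high probability is the true parity once the sample size exceeds roughly $r \log n / (1-2\eta)^2$. This is a minimal illustrative sketch, not the paper's algorithm; the function names and the toy data generator are our own.

```python
from itertools import combinations
import random

def parity(subset, x):
    """Parity (XOR) of the bits of x indexed by subset."""
    return sum(x[i] for i in subset) % 2

def brute_force_sparse_parity(samples, n, r):
    """Brute-force baseline: test every r-subset of the n variables.

    The true parity agrees with a ~(1 - eta) fraction of the noisy
    labels, while any other parity agrees with only ~1/2 of them
    (under the uniform distribution), so minimizing empirical error
    recovers it given enough samples. Cost: O(n^r) candidates.
    """
    best, best_err = None, float("inf")
    for subset in combinations(range(n), r):
        err = sum(parity(subset, x) != y for x, y in samples)
        if err < best_err:
            best, best_err = subset, err
    return best

# Toy demonstration: labels of a hidden 2-sparse parity, flipped
# independently with probability eta (random classification noise).
random.seed(0)
n, r, eta, m = 10, 2, 0.1, 300
hidden = (2, 5)
samples = []
for _ in range(m):
    x = [random.randint(0, 1) for _ in range(n)]
    y = parity(hidden, x) ^ (1 if random.random() < eta else 0)
    samples.append((x, y))

recovered = brute_force_sparse_parity(samples, n, r)
```

With $m = 300$ samples and $\eta = 0.1$, the hidden parity's expected error count ($\approx 30$) is far below that of any competing parity ($\approx 150$), so the search recovers it reliably.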

Cite

Text

Grigorescu et al. "On Noise-Tolerant Learning of Sparse Parities and Related Problems." International Conference on Algorithmic Learning Theory, 2011. doi:10.1007/978-3-642-24412-4_32

Markdown

[Grigorescu et al. "On Noise-Tolerant Learning of Sparse Parities and Related Problems." International Conference on Algorithmic Learning Theory, 2011.](https://mlanthology.org/alt/2011/grigorescu2011alt-noisetolerant/) doi:10.1007/978-3-642-24412-4_32

BibTeX

@inproceedings{grigorescu2011alt-noisetolerant,
  title     = {{On Noise-Tolerant Learning of Sparse Parities and Related Problems}},
  author    = {Grigorescu, Elena and Reyzin, Lev and Vempala, Santosh S.},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {2011},
  pages     = {413--424},
  doi       = {10.1007/978-3-642-24412-4_32},
  url       = {https://mlanthology.org/alt/2011/grigorescu2011alt-noisetolerant/}
}