Tight Bounds on $\ell_1$ Approximation and Learning of Self-Bounding Functions

Abstract

We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube $\{0,1\}^n$. Informally, a function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if, for every $x \in \{0,1\}^n$, $f(x)$ upper bounds the sum of all $n$ marginal decreases in the value of the function at $x$. Self-bounding functions include such well-known classes as submodular and fractionally subadditive (XOS) functions. They were introduced by Boucheron et al. in the context of concentration-of-measure inequalities. Our main result is a nearly tight $\ell_1$-approximation of self-bounding functions by low-degree juntas. Specifically, every self-bounding function can be $\epsilon$-approximated in $\ell_1$ by a polynomial of degree $\tilde{O}(1/\epsilon)$ over $2^{\tilde{O}(1/\epsilon)}$ variables. We show that both the degree and the junta size are optimal up to logarithmic factors. Previous techniques considered the stronger $\ell_2$ approximation and proved nearly tight bounds of $\Theta(1/\epsilon^2)$ on the degree and $2^{\Theta(1/\epsilon^2)}$ on the number of variables. Our bounds rely on an analysis of the noise stability of self-bounding functions together with a stronger connection between noise stability and $\ell_1$ approximation by low-degree polynomials. This technique can also be used to obtain tighter bounds on $\ell_1$ approximation by low-degree polynomials and a faster learning algorithm for halfspaces.

These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of self-bounding functions relative to the uniform distribution. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity $n^{\tilde{\Theta}(1/\epsilon)}$.
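For concreteness, the informal definition above corresponds to the following condition. Writing $x^{i \mapsto z}$ for $x$ with its $i$-th coordinate set to $z$ (notation ours), a function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$,

$$\sum_{i=1}^{n} \Big( f(x) - \min_{z \in \{0,1\}} f(x^{i \mapsto z}) \Big) \le f(x).$$

(The full definition in Boucheron et al. additionally requires each individual marginal decrease to be at most 1; the displayed sum condition is the part described informally in the abstract.) A minimal brute-force sketch of this condition for small $n$ is below; it is illustrative only, and the function name is our own, not from the paper:

from itertools import product

def is_self_bounding(f, n):
    # Check that for every x in {0,1}^n the sum of marginal decreases
    # sum_i ( f(x) - min_z f(x with coordinate i set to z) ) is at most f(x).
    for x in product((0, 1), repeat=n):
        fx = f(x)
        total = sum(
            fx - min(f(x[:i] + (z,) + x[i + 1:]) for z in (0, 1))
            for i in range(n)
        )
        if total > fx + 1e-9:  # tolerance for floating-point arithmetic
            return False
    return True

# Example: the monotone submodular function f(x) = min(|x|, 2)
# (a concave function of the Hamming weight) is self-bounding.
print(is_self_bounding(lambda x: min(sum(x), 2), 4))  # True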

Cite

Text

Feldman et al. "Tight Bounds on $\ell_1$ Approximation and Learning of Self-Bounding Functions." Proceedings of the 28th International Conference on Algorithmic Learning Theory, 2017.

Markdown

[Feldman et al. "Tight Bounds on $\ell_1$ Approximation and Learning of Self-Bounding Functions." Proceedings of the 28th International Conference on Algorithmic Learning Theory, 2017.](https://mlanthology.org/alt/2017/feldman2017alt-tight/)

BibTeX

@inproceedings{feldman2017alt-tight,
  title     = {{Tight Bounds on $\ell_1$ Approximation and Learning of Self-Bounding Functions}},
  author    = {Feldman, Vitaly and Kothari, Pravesh and Vondrák, Jan},
  booktitle = {Proceedings of the 28th International Conference on Algorithmic Learning Theory},
  year      = {2017},
  pages     = {540--559},
  volume    = {76},
  url       = {https://mlanthology.org/alt/2017/feldman2017alt-tight/}
}