Complexity of Computing Generalized VC-Dimensions

Abstract

In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of {0, ..., m}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC^*-dimension, the Ψ_*-dimension, and the Ψ_G-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that the VC^*-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log^2 J) variables, which includes the original VC-dimension problem as a special case. We also show that the Ψ_G-dimension problem is still reducible to the satisfiability problem of length J with O(log^2 J) variables, while the Ψ_*-dimension problem becomes NP-complete.
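As a point of reference for the decision problem described above, the following is a minimal brute-force sketch (not the paper's reduction) of the ordinary VC-dimension problem for a binary-valued class given as a 0/1 matrix: rows are functions, columns are domain points, and the question is whether some set of K + 1 columns is shattered. The function names and example matrix are illustrative only; the generalized Ψ-dimensions from the paper replace the shattering condition and are not implemented here.

```python
from itertools import combinations

def is_shattered(F, S):
    """Check whether the column set S is shattered by the class F.

    F is a list of rows (tuples of 0/1 values); S is a tuple of column indices.
    S is shattered iff every 0/1 pattern on S is realized by some row of F.
    """
    realized = {tuple(row[j] for j in S) for row in F}
    return len(realized) == 2 ** len(S)

def vc_dimension_exceeds(F, K):
    """Decide the VC-dimension problem: is VC-dim(F) > K?

    Naive search for a shattered set of size K + 1. Since a shattered set of
    size d needs 2^d distinct rows, the dimension of a class given by a
    J-size matrix is at most about log J, so even this brute force runs in
    quasi-polynomial time; the paper sharpens this via a reduction to
    satisfiability with O(log^2 J) variables.
    """
    n_cols = len(F[0]) if F else 0
    return any(is_shattered(F, S) for S in combinations(range(n_cols), K + 1))

# Illustrative example: four functions on three points.
F = [(0, 0, 1),
     (0, 1, 0),
     (1, 0, 0),
     (1, 1, 1)]
print(vc_dimension_exceeds(F, 1))   # True: the first two columns are shattered
print(vc_dimension_exceeds(F, 2))   # False: 4 rows cannot shatter 3 columns
```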

Cite

Text

Shinohara. "Complexity of Computing Generalized VC-Dimensions." European Conference on Machine Learning, 1994. doi:10.1007/3-540-57868-4_87

Markdown

[Shinohara. "Complexity of Computing Generalized VC-Dimensions." European Conference on Machine Learning, 1994.](https://mlanthology.org/ecmlpkdd/1994/shinohara1994ecml-complexity/) doi:10.1007/3-540-57868-4_87

BibTeX

@inproceedings{shinohara1994ecml-complexity,
  title     = {{Complexity of Computing Generalized VC-Dimensions}},
  author    = {Shinohara, Ayumi},
  booktitle = {European Conference on Machine Learning},
  year      = {1994},
  pages     = {415--418},
  doi       = {10.1007/3-540-57868-4_87},
  url       = {https://mlanthology.org/ecmlpkdd/1994/shinohara1994ecml-complexity/}
}