Complexity of Computing Vapnik-Chervonenkis Dimension
Abstract
The Vapnik-Chervonenkis (VC) dimension is known to be a crucial measure of polynomial-sample learnability in the PAC-learning model. This paper investigates the complexity of computing the VC-dimension of a concept class over a finite learning domain. We consider a decision problem called the discrete VC-dimension problem: given a matrix representing a concept class F and an integer K, determine whether the VC-dimension of F is greater than K. We prove that (1) the discrete VC-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log^2 J) variables, and (2) for every constant C, the satisfiability problem in conjunctive normal form with m clauses and C log^2 m variables is polynomial-time reducible to the discrete VC-dimension problem. These results can be interpreted as saying that, in some sense, the problem is "complete" for the class of sets computable in n^{O(log n)} time.
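To make the decision problem concrete, here is a minimal brute-force sketch (not from the paper) of computing the VC-dimension of a concept class given as a 0/1 matrix, where rows are concepts and columns are domain points; the function names are illustrative, and the exponential search is exactly why the problem's complexity is interesting.

```python
from itertools import combinations

def vc_dimension(F):
    """Brute-force VC dimension of a concept class.

    F is a 0/1 matrix: each row is a concept over a finite
    domain whose points are the columns. A set S of columns is
    shattered if restricting the rows to S yields all 2^|S|
    binary patterns.
    """
    n = len(F[0]) if F else 0
    dim = 0
    for k in range(1, n + 1):
        # Any subset of a shattered set is shattered, so if no
        # k-set is shattered, no larger set can be either.
        if any(len({tuple(row[i] for i in S) for row in F}) == 2 ** k
               for S in combinations(range(n), k)):
            dim = k
        else:
            break
    return dim

def discrete_vc_dimension(F, K):
    """Decision version considered in the paper: is VC-dim(F) > K?"""
    return vc_dimension(F) > K
```

For example, the class of all four subsets of a two-point domain, `[[0,0],[0,1],[1,0],[1,1]]`, has VC-dimension 2, while `[[0,0],[1,1]]` shatters only single points and has VC-dimension 1.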
Cite
Text
Shinohara. "Complexity of Computing Vapnik-Chervonenkis Dimension." International Conference on Algorithmic Learning Theory, 1993. doi:10.1007/3-540-57370-4_54
Markdown
[Shinohara. "Complexity of Computing Vapnik-Chervonenkis Dimension." International Conference on Algorithmic Learning Theory, 1993.](https://mlanthology.org/alt/1993/shinohara1993alt-complexity/) doi:10.1007/3-540-57370-4_54
BibTeX
@inproceedings{shinohara1993alt-complexity,
title = {{Complexity of Computing Vapnik-Chervonenkis Dimension}},
author = {Shinohara, Ayumi},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {1993},
pages = {279-287},
doi = {10.1007/3-540-57370-4_54},
url = {https://mlanthology.org/alt/1993/shinohara1993alt-complexity/}
}