Nearly Uniform Validation Improves Compression-Based Error Bounds

Abstract

This paper develops bounds on out-of-sample error rates for support vector machines (SVMs). The bounds are based on the numbers of support vectors in the SVMs rather than on VC dimension. The bounds developed here improve on support vector counting bounds derived using Littlestone and Warmuth's compression-based bounding technique.
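The baseline the paper improves on, the Littlestone-Warmuth compression bound, can be sketched numerically. The snippet below is an illustrative implementation (not taken from the paper): for a classifier consistent with the training sample whose hypothesis is determined by k of the n training examples (the support vectors of an SVM), it union-bounds over the C(n, k) possible compression sets and inverts the exponential tail on the n − k unused examples.

```python
from math import comb, log

def lw_compression_bound(n, k, delta):
    """Classic Littlestone-Warmuth style compression bound (illustrative).

    For a sample-consistent classifier determined by k of the n training
    examples (e.g. an SVM's support vectors), with probability at least
    1 - delta the out-of-sample error rate is at most the returned value.
    """
    assert 0 < k < n
    # Union bound over the C(n, k) candidate compression sets, then invert
    # the tail bound exp(-eps * (n - k)) on the n - k held-out examples.
    return (log(comb(n, k)) + log(1.0 / delta)) / (n - k)

# Example: 1000 training examples, 50 support vectors, 95% confidence.
print(lw_compression_bound(1000, 50, 0.05))
```

Note how the bound degrades as the number of support vectors grows, which is the quantity this paper's nearly uniform validation technique targets.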

Cite

Text

Bax. "Nearly Uniform Validation Improves Compression-Based Error Bounds." Journal of Machine Learning Research, 2008.

Markdown

[Bax. "Nearly Uniform Validation Improves Compression-Based Error Bounds." Journal of Machine Learning Research, 2008.](https://mlanthology.org/jmlr/2008/bax2008jmlr-nearly/)

BibTeX

@article{bax2008jmlr-nearly,
  title     = {{Nearly Uniform Validation Improves Compression-Based Error Bounds}},
  author    = {Bax, Eric},
  journal   = {Journal of Machine Learning Research},
  year      = {2008},
  pages     = {1741--1755},
  volume    = {9},
  url       = {https://mlanthology.org/jmlr/2008/bax2008jmlr-nearly/}
}