Computable Shell Decomposition Bounds

Abstract

Haussler, Kearns, Seung, and Tishby introduced the notion of a shell decomposition of the union bound as a means of understanding certain empirical phenomena in learning curves, such as phase transitions. Here we use a variant of their ideas to derive an upper bound on the generalization error of a hypothesis computable from its training error and the histogram of training errors for the hypotheses in the class. In most cases this new bound is significantly tighter than traditional bounds computed from the training error and the cardinality of the class. Our results can also be viewed as providing a rigorous foundation for a model selection algorithm proposed by Scheffer and Joachims.
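To give a feel for the kind of bound the abstract describes, here is a minimal illustrative sketch in Python. It is not the exact bound from the paper; it assumes a simple Hoeffding-style slack term in which the union bound is taken only over the hypotheses sharing the same training error (the "shell"), with the confidence parameter split across the occupied shells. The function name `shell_bound` and the histogram representation are hypothetical choices for this sketch.

```python
import math

def shell_bound(train_error, histogram, m, delta=0.05):
    """Illustrative shell-style generalization bound (a sketch, not the
    paper's exact bound).

    train_error: observed training error of the chosen hypothesis.
    histogram:   dict mapping a training-error value to the number of
                 hypotheses in the class attaining that error.
    m:           number of training examples.
    delta:       confidence parameter.
    """
    k = len(histogram)                # number of occupied shells
    n_shell = histogram[train_error]  # hypotheses sharing this training error
    # Union bound over the shell, with delta split across the k shells.
    slack = math.sqrt((math.log(n_shell) + math.log(k / delta)) / (2 * m))
    return train_error + slack

def naive_bound(train_error, class_size, m, delta=0.05):
    """Classical union bound over the entire class, for comparison."""
    slack = math.sqrt((math.log(class_size) + math.log(1 / delta)) / (2 * m))
    return train_error + slack
```

When the chosen hypothesis lies in a sparsely populated shell, the shell term `log(n_shell)` can be much smaller than `log(class_size)`, which is the intuition behind the claim that the histogram-based bound is usually tighter than the cardinality-based one.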

Cite

Text

Langford and McAllester. "Computable Shell Decomposition Bounds." Journal of Machine Learning Research, 2004.

Markdown

[Langford and McAllester. "Computable Shell Decomposition Bounds." Journal of Machine Learning Research, 2004.](https://mlanthology.org/jmlr/2004/langford2004jmlr-computable/)

BibTeX

@article{langford2004jmlr-computable,
  title     = {{Computable Shell Decomposition Bounds}},
  author    = {Langford, John and McAllester, David},
  journal   = {Journal of Machine Learning Research},
  year      = {2004},
  pages     = {529--547},
  volume    = {5},
  url       = {https://mlanthology.org/jmlr/2004/langford2004jmlr-computable/}
}