Generalization Bounds for Learning with Linear, Polygonal, Quadratic and Conic Side Knowledge
Abstract
In this paper, we consider a supervised learning setting where side knowledge is provided about the labels of unlabeled examples. The side knowledge has the effect of reducing the hypothesis space, leading to tighter generalization bounds, and thus possibly better generalization. We consider several types of side knowledge, the first leading to linear and polygonal constraints on the hypothesis space, the second leading to quadratic constraints, and the last leading to conic constraints. We show how different types of domain knowledge can lead directly to these kinds of side knowledge. We prove bounds on complexity measures of the hypothesis space for quadratic and conic side knowledge, and show that these bounds are tight in a specific sense for the quadratic case.
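As a rough illustration of how such side knowledge shrinks the hypothesis space (a minimal sketch; the symbols below are illustrative and not taken from the paper), consider a linear model class $x \mapsto \beta^\top x$ with $\|\beta\|_2 \le B$. Side knowledge about the labels of unlabeled points can then be encoded as additional constraints on $\beta$, for example:

\begin{align*}
\mathcal{F}_{\mathrm{lin}}  &= \{\,\beta : \|\beta\|_2 \le B,\ A\beta \le b\,\} && \text{(linear / polygonal side knowledge)}\\
\mathcal{F}_{\mathrm{quad}} &= \{\,\beta : \|\beta\|_2 \le B,\ \beta^\top Q\beta + q^\top \beta \le c\,\} && \text{(quadratic side knowledge)}\\
\mathcal{F}_{\mathrm{cone}} &= \{\,\beta : \|\beta\|_2 \le B,\ \|D\beta + d\|_2 \le e^\top \beta + f\,\} && \text{(conic side knowledge)}
\end{align*}

Generalization bounds of the kind studied in the paper then depend on complexity measures of these smaller constrained sets rather than of the full norm ball.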
Cite
Text
Tulabandhula and Rudin. "Generalization Bounds for Learning with Linear, Polygonal, Quadratic and Conic Side Knowledge." Machine Learning, 2015. doi:10.1007/s10994-014-5478-4
Markdown
[Tulabandhula and Rudin. "Generalization Bounds for Learning with Linear, Polygonal, Quadratic and Conic Side Knowledge." Machine Learning, 2015.](https://mlanthology.org/mlj/2015/tulabandhula2015mlj-generalization/) doi:10.1007/s10994-014-5478-4
BibTeX
@article{tulabandhula2015mlj-generalization,
title = {{Generalization Bounds for Learning with Linear, Polygonal, Quadratic and Conic Side Knowledge}},
author = {Tulabandhula, Theja and Rudin, Cynthia},
journal = {Machine Learning},
year = {2015},
pages = {183-216},
doi = {10.1007/s10994-014-5478-4},
volume = {100},
url = {https://mlanthology.org/mlj/2015/tulabandhula2015mlj-generalization/}
}