Generalization Error Bounds for Classifiers Trained with Interdependent Data
Abstract
In this paper we propose a general framework for studying the generalization properties of binary classifiers trained with data that may be dependent but are generated deterministically from a sample of independent examples. The framework yields generalization bounds for binary classification and for some ranking problems, and clarifies the relationship between these learning tasks.
Cite

Text

Usunier et al. "Generalization Error Bounds for Classifiers Trained with Interdependent Data." Neural Information Processing Systems, 2005.

Markdown

[Usunier et al. "Generalization Error Bounds for Classifiers Trained with Interdependent Data." Neural Information Processing Systems, 2005.](https://mlanthology.org/neurips/2005/usunier2005neurips-generalization/)

BibTeX
@inproceedings{usunier2005neurips-generalization,
  title = {{Generalization Error Bounds for Classifiers Trained with Interdependent Data}},
  author = {Usunier, Nicolas and Amini, Massih R. and Gallinari, Patrick},
  booktitle = {Neural Information Processing Systems},
  year = {2005},
  pages = {1369-1376},
  url = {https://mlanthology.org/neurips/2005/usunier2005neurips-generalization/}
}