Stable Transductive Learning
Abstract
We develop a new error bound for transductive learning algorithms. The slack term in the new bound is a function of a relaxed notion of transductive stability, which measures the sensitivity of the algorithm to most pairwise exchanges of training and test set points. Our bound is based on a novel concentration inequality for symmetric functions of permutations. We also present a simple sampling technique that can estimate, with high probability, the weak stability of transductive learning algorithms with respect to a given dataset. We demonstrate the usefulness of our estimation technique on a well-known transductive learning algorithm.
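The sampling idea in the abstract can be illustrated with a minimal Monte-Carlo sketch: repeatedly exchange one training point with one test point, rerun the learner, and record how much its predictions move. This is not the paper's exact estimator; the `learn(train, test)` interface (returning a prediction for each test point) and the max-change sensitivity measure are illustrative assumptions.

```python
import random

def estimate_exchange_sensitivity(train, test, learn, n_samples=100, seed=0):
    """Sketch of stability estimation via sampled train/test exchanges.

    `learn(train, test)` is a hypothetical transductive learner that
    returns a dict mapping each test point to a real-valued prediction.
    Returns the empirical distribution of per-exchange sensitivities.
    """
    rng = random.Random(seed)
    base = learn(train, test)
    diffs = []
    for _ in range(n_samples):
        # pick one training point and one test point, then swap them
        i = rng.randrange(len(train))
        j = rng.randrange(len(test))
        tr, te = list(train), list(test)
        tr[i], te[j] = te[j], tr[i]
        swapped = learn(tr, te)
        # sensitivity of this exchange: largest prediction change over
        # the test points that appear in both runs
        common = set(te) & set(test)
        diffs.append(
            max(abs(base[x] - swapped[x]) for x in common) if common else 0.0
        )
    return diffs

# A stable algorithm yields uniformly small values in `diffs`; the
# paper's bound only needs this to hold for *most* exchanges, which is
# what the empirical distribution lets one check.
```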
Cite
Text
El-Yaniv and Pechyony. "Stable Transductive Learning." Annual Conference on Computational Learning Theory, 2006. doi:10.1007/11776420_6

Markdown

[El-Yaniv and Pechyony. "Stable Transductive Learning." Annual Conference on Computational Learning Theory, 2006.](https://mlanthology.org/colt/2006/elyaniv2006colt-stable/) doi:10.1007/11776420_6

BibTeX
@inproceedings{elyaniv2006colt-stable,
title = {{Stable Transductive Learning}},
author = {El-Yaniv, Ran and Pechyony, Dmitry},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2006},
pages = {35--49},
doi = {10.1007/11776420_6},
url = {https://mlanthology.org/colt/2006/elyaniv2006colt-stable/}
}