The Entropy Regularization Information Criterion
Abstract
Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering the whole range of general linear additive models. This is achieved by a data dependent analysis of the eigenvalues of the corresponding design matrix.
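The abstract's central object is the design matrix of a basis-function expansion and its eigenvalue spectrum. The following is a minimal illustrative sketch (not code from the paper) of how such a data dependent quantity can be computed; the Gaussian basis functions, their centers, and the sample data are assumptions chosen purely for the example.

```python
import numpy as np

# Sketch only: build the design matrix for a parametrized basis-function
# expansion and inspect its eigenvalue spectrum, the data dependent
# quantity the abstract refers to. Basis choice and data are illustrative.

def design_matrix(X, basis_functions):
    """Design matrix Phi with Phi[i, j] = phi_j(x_i)."""
    return np.column_stack([phi(X) for phi in basis_functions])

# Example data and basis: Gaussian bumps standing in for "arbitrary
# (parametrized) basis functions"; centers and widths are arbitrary.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=200)
centers = np.linspace(-1.0, 1.0, 10)
basis = [lambda x, c=c: np.exp(-(x - c) ** 2 / 0.1) for c in centers]

Phi = design_matrix(X, basis)

# Eigenvalues of the symmetric matrix Phi^T Phi; the rate at which they
# decay is what entropy-number style capacity bounds are built from.
eigvals = np.linalg.eigvalsh(Phi.T @ Phi)
print(np.sort(eigvals)[::-1])
```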
Cite
Text
Smola et al. "The Entropy Regularization Information Criterion." Neural Information Processing Systems, 1999.
Markdown
[Smola et al. "The Entropy Regularization Information Criterion." Neural Information Processing Systems, 1999.](https://mlanthology.org/neurips/1999/smola1999neurips-entropy/)
BibTeX
@inproceedings{smola1999neurips-entropy,
title = {{The Entropy Regularization Information Criterion}},
author = {Smola, Alex J. and Shawe-Taylor, John and Schölkopf, Bernhard and Williamson, Robert C.},
booktitle = {Neural Information Processing Systems},
year = {1999},
pages = {342--348},
url = {https://mlanthology.org/neurips/1999/smola1999neurips-entropy/}
}