Information Diffusion Kernels
Abstract
A new family of kernels for statistical learning is introduced that exploits the geometric structure of statistical models. Based on the heat equation on the Riemannian manifold defined by the Fisher information metric, information diffusion kernels generalize the Gaussian kernel of Euclidean space, and provide a natural way of combining generative statistical modeling with non-parametric discriminative learning. As a special case, the kernels give a new approach to applying kernel-based learning algorithms to discrete data. Bounds on covering numbers for the new kernels are proved using spectral theory in differential geometry, and experimental results are presented for text classification.
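To make the idea concrete, here is a small sketch of the kernel for the multinomial family, the discrete-data special case the abstract mentions. It uses the closed-form approximation in which the geodesic distance on the simplex under the Fisher information metric, d(θ, θ′) = 2 arccos(Σᵢ √(θᵢ θ′ᵢ)), is substituted into the Gaussian heat-kernel form exp(−d²/4t). The function name and the diffusion-time parameter `t` are illustrative choices, not from this page.

```python
import numpy as np

def multinomial_diffusion_kernel(theta, theta_prime, t=0.1):
    """Approximate information diffusion kernel on the multinomial simplex.

    Sketch based on the closed-form heat-kernel approximation:
    the geodesic distance under the Fisher information metric,
        d(theta, theta') = 2 * arccos(sum_i sqrt(theta_i * theta'_i)),
    is plugged into the Gaussian-style form exp(-d^2 / (4 t)).
    `theta` and `theta_prime` are probability vectors (nonnegative, sum to 1).
    """
    # Bhattacharyya-type inner product; clip to guard arccos against
    # floating-point values marginally outside [-1, 1].
    inner = np.clip(np.sum(np.sqrt(theta * theta_prime)), -1.0, 1.0)
    d = 2.0 * np.arccos(inner)  # geodesic distance on the simplex
    return np.exp(-d**2 / (4.0 * t))
```

For identical distributions the distance is zero, so the kernel value is 1; it decays as the two distributions move apart on the simplex, mirroring how the Gaussian kernel decays with Euclidean distance.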
Cite
Text
Lebanon and Lafferty. "Information Diffusion Kernels." Neural Information Processing Systems, 2002.

Markdown

[Lebanon and Lafferty. "Information Diffusion Kernels." Neural Information Processing Systems, 2002.](https://mlanthology.org/neurips/2002/lebanon2002neurips-information/)

BibTeX
@inproceedings{lebanon2002neurips-information,
title = {{Information Diffusion Kernels}},
author = {Lebanon, Guy and Lafferty, John D.},
booktitle = {Neural Information Processing Systems},
year = {2002},
pages = {391-398},
url = {https://mlanthology.org/neurips/2002/lebanon2002neurips-information/}
}