Nonextensive Entropic Kernels

Abstract

Positive definite kernels on probability measures have recently been applied to structured data classification problems. Some of these kernels are related to classic information-theoretic quantities, such as mutual information and the Jensen-Shannon divergence. Meanwhile, driven by recent advances in Tsallis statistics, nonextensive generalizations of Shannon's information theory have been proposed. This paper bridges these two trends. We introduce the Jensen-Tsallis q-difference, a generalization of the Jensen-Shannon divergence. We then define a new family of nonextensive mutual information kernels, which allow weights to be assigned to their arguments, and which includes the Boolean, Jensen-Shannon, and linear kernels as particular cases. We illustrate the performance of these kernels on text categorization tasks.

Cite

Text

Martins et al. "Nonextensive Entropic Kernels." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390237

Markdown

[Martins et al. "Nonextensive Entropic Kernels." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/martins2008icml-nonextensive/) doi:10.1145/1390156.1390237

BibTeX

@inproceedings{martins2008icml-nonextensive,
  title     = {{Nonextensive Entropic Kernels}},
  author    = {Martins, André F. T. and Figueiredo, Mário A. T. and Aguiar, Pedro M. Q. and Smith, Noah A. and Xing, Eric P.},
  booktitle = {International Conference on Machine Learning},
  year      = {2008},
  pages     = {640--647},
  doi       = {10.1145/1390156.1390237},
  url       = {https://mlanthology.org/icml/2008/martins2008icml-nonextensive/}
}