A Kernel Between Sets of Vectors

Abstract

In various application domains, including image recognition, it is natural to represent each example as a set of vectors. With a base kernel we can implicitly map these vectors to a Hilbert space and fit a Gaussian distribution to the whole set using Kernel PCA. We define our kernel between examples as Bhattacharyya's measure of affinity between such Gaussians. The resulting kernel is computable in closed form and enjoys many favorable properties, including graceful behavior under transformations, potentially justifying the vector set representation even in cases where more conventional representations also exist.

Proceedings of the Twentieth International Conference on Machine Learning (ICML 2003)
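To make the idea concrete, here is a minimal sketch of the Bhattacharyya affinity between Gaussians fit to two vector sets. It simplifies the paper's method by fitting the Gaussians directly in input space with a regularized covariance (the paper works in the Hilbert space induced by a base kernel, via Kernel PCA); the function names and the `reg` parameter are illustrative choices, not from the paper.

```python
import numpy as np

def fit_gaussian(X, reg=1e-3):
    """Fit a mean and regularized covariance to a set of row vectors X."""
    mu = X.mean(axis=0)
    d = X.shape[1]
    # Ridge regularization keeps the covariance invertible for small sets.
    cov = np.cov(X, rowvar=False) + reg * np.eye(d)
    return mu, cov

def bhattacharyya_kernel(X1, X2, reg=1e-3):
    """Bhattacharyya affinity K = integral of sqrt(p1(x) * p2(x)) dx
    between Gaussians p1, p2 fit to the vector sets X1, X2 (closed form)."""
    mu1, S1 = fit_gaussian(X1, reg)
    mu2, S2 = fit_gaussian(X2, reg)
    S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
    A = 0.5 * (S1i + S2i)              # precision of the merged quadratic form
    b = 0.5 * (S1i @ mu1 + S2i @ mu2)  # its linear term
    log_k = (0.5 * b @ np.linalg.inv(A) @ b
             - 0.25 * (mu1 @ S1i @ mu1 + mu2 @ S2i @ mu2)
             - 0.25 * np.linalg.slogdet(S1)[1]
             - 0.25 * np.linalg.slogdet(S2)[1]
             - 0.5 * np.linalg.slogdet(A)[1])
    return np.exp(log_k)
```

By Cauchy-Schwarz the affinity lies in (0, 1], reaching 1 exactly when the two fitted Gaussians coincide, so the kernel behaves like a normalized similarity between sets.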

Cite

Text

Kondor and Jebara. "A Kernel Between Sets of Vectors." International Conference on Machine Learning, 2003.

Markdown

[Kondor and Jebara. "A Kernel Between Sets of Vectors." International Conference on Machine Learning, 2003.](https://mlanthology.org/icml/2003/kondor2003icml-kernel/)

BibTeX

@inproceedings{kondor2003icml-kernel,
  title     = {{A Kernel Between Sets of Vectors}},
  author    = {Kondor, Risi and Jebara, Tony},
  booktitle = {International Conference on Machine Learning},
  year      = {2003},
  pages     = {361--368},
  url       = {https://mlanthology.org/icml/2003/kondor2003icml-kernel/}
}