Log-Hilbert-Schmidt Metric Between Positive Definite Operators on Hilbert Spaces

Abstract

This paper introduces a novel mathematical and computational framework, namely the *Log-Hilbert-Schmidt metric* between positive definite operators on a Hilbert space. It generalizes the Log-Euclidean metric on the Riemannian manifold of positive definite matrices to the infinite-dimensional setting. The general framework is applied in particular to compute distances between covariance operators on a Reproducing Kernel Hilbert Space (RKHS), for which we obtain explicit formulas via the corresponding Gram matrices. Empirically, we apply our formulation to the task of multi-category image classification, where each image is represented by an infinite-dimensional RKHS covariance operator. On several challenging datasets, our method significantly outperforms approaches based on covariance matrices computed directly on the original input features, including those using the Log-Euclidean metric and the Stein and Jeffreys divergences, achieving new state-of-the-art results.
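Since this page carries only the abstract, here is a minimal sketch of the finite-dimensional Log-Euclidean metric that the paper generalizes, d(A, B) = ||log(A) - log(B)||_F for symmetric positive definite matrices; the infinite-dimensional Log-Hilbert-Schmidt metric and its Gram-matrix formulas are given in the paper itself and not reproduced here. All function names below are illustrative, not from the authors' code:

```python
import numpy as np

def spd_log(A):
    """Matrix logarithm of a symmetric positive definite matrix,
    computed via its eigendecomposition: V diag(log w) V^T."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance d(A, B) = ||log(A) - log(B)||_F."""
    return np.linalg.norm(spd_log(A) - spd_log(B), ord="fro")

# Toy usage: compare covariance matrices of two small feature sets.
rng = np.random.default_rng(0)
X, Y = rng.standard_normal((50, 3)), rng.standard_normal((60, 3))
A = np.cov(X, rowvar=False) + 1e-6 * np.eye(3)  # regularize to keep SPD
B = np.cov(Y, rowvar=False) + 1e-6 * np.eye(3)
print(log_euclidean_distance(A, B))
```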

Cite

Text

Minh et al. "Log-Hilbert-Schmidt Metric Between Positive Definite Operators on Hilbert Spaces." Neural Information Processing Systems, 2014.

Markdown

[Minh et al. "Log-Hilbert-Schmidt Metric Between Positive Definite Operators on Hilbert Spaces." Neural Information Processing Systems, 2014.](https://mlanthology.org/neurips/2014/quang2014neurips-loghilbertschmidt/)

BibTeX

@inproceedings{quang2014neurips-loghilbertschmidt,
  title     = {{Log-Hilbert-Schmidt Metric Between Positive Definite Operators on Hilbert Spaces}},
  author    = {Minh, Ha Quang and San Biagio, Marco and Murino, Vittorio},
  booktitle = {Neural Information Processing Systems},
  year      = {2014},
  pages     = {388--396},
  url       = {https://mlanthology.org/neurips/2014/quang2014neurips-loghilbertschmidt/}
}