An Efficient Image Similarity Measure Based on Approximations of KL-Divergence Between Two Gaussian Mixtures

Abstract

We present two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians. The first method is based on matching between the Gaussian elements of the two Gaussian mixture densities. The second method is based on the unscented transform. The proposed methods are utilized for image retrieval tasks. Continuous probabilistic image modeling based on mixtures of Gaussians, together with the KL divergence as an image similarity measure, can be used for image retrieval with remarkable performance. The efficiency and the performance of the proposed KL approximation methods are demonstrated on both simulated data and real image data sets. The experimental results indicate that our proposed approximations outperform previously suggested methods.
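The matching-based idea can be sketched briefly: the KL divergence between two GMMs has no closed form, but each Gaussian component of the first mixture can be paired with a component of the second via a closed-form Gaussian-to-Gaussian KL, and the component divergences combined with the mixture weights. The sketch below is a minimal illustration of this general matching scheme, not a verbatim transcription of the paper's formula; the function names and the exact pairing rule (minimizing the component KL penalized by the log-weight) are assumptions for illustration.

```python
import numpy as np

def kl_gauss(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) ) between two Gaussians."""
    d = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def kl_gmm_match(wf, mus_f, covs_f, wg, mus_g, covs_g):
    """Matching-based approximation of KL(f || g) for two GMMs.

    Each component f_i (weight wf[i]) is matched to the component g_j
    that minimizes KL(f_i || g_j) - log(wg[j]); the weighted component
    divergences plus the log-weight ratios are then summed.
    (Illustrative pairing rule; an assumption, not the paper's exact form.)
    """
    total = 0.0
    for a, mu_i, cov_i in zip(wf, mus_f, covs_f):
        costs = [kl_gauss(mu_i, cov_i, mu_j, cov_j) - np.log(b)
                 for b, mu_j, cov_j in zip(wg, mus_g, covs_g)]
        j = int(np.argmin(costs))
        total += a * (kl_gauss(mu_i, cov_i, mus_g[j], covs_g[j])
                      + np.log(a / wg[j]))
    return total

# Tiny example: a two-component 2D mixture compared with itself.
w = [0.5, 0.5]
mus = [np.zeros(2), np.full(2, 10.0)]
covs = [np.eye(2), np.eye(2)]
print(kl_gmm_match(w, mus, covs, w, mus, covs))  # -> 0.0 (identical mixtures)
```

Because the per-component KL is closed-form, this approximation costs only O(k1 * k2) Gaussian KL evaluations for mixtures with k1 and k2 components, which is what makes it attractive for large-scale retrieval compared with Monte Carlo estimation.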

Cite

Text

Goldberger et al. "An Efficient Image Similarity Measure Based on Approximations of KL-Divergence Between Two Gaussian Mixtures." IEEE/CVF International Conference on Computer Vision, 2003. doi:10.1109/ICCV.2003.1238387

Markdown

[Goldberger et al. "An Efficient Image Similarity Measure Based on Approximations of KL-Divergence Between Two Gaussian Mixtures." IEEE/CVF International Conference on Computer Vision, 2003.](https://mlanthology.org/iccv/2003/goldberger2003iccv-efficient/) doi:10.1109/ICCV.2003.1238387

BibTeX

@inproceedings{goldberger2003iccv-efficient,
  title     = {{An Efficient Image Similarity Measure Based on Approximations of KL-Divergence Between Two Gaussian Mixtures}},
  author    = {Goldberger, Jacob and Gordon, Shiri and Greenspan, Hayit},
  booktitle = {IEEE/CVF International Conference on Computer Vision},
  year      = {2003},
  pages     = {487--493},
  doi       = {10.1109/ICCV.2003.1238387},
  url       = {https://mlanthology.org/iccv/2003/goldberger2003iccv-efficient/}
}