Learning Non-Metric Partial Similarity Based on Maximal Margin Criterion

Abstract

The performance of many computer vision and machine learning algorithms critically depends on the quality of the similarity measure defined over the feature space. Previous works usually utilize metric distances, which are often epistemologically different from the perceptual distance of human beings. In this paper, a novel non-metric partial similarity measure is introduced, which is designed to automatically capture the prominent partial similarity between two images while ignoring the confusing, unimportant dissimilarity. This measure is potentially useful in face recognition, since it can help identify the inherent intra-personal similarity and thus reduce the influence caused by large variations such as expression and occlusion. Moreover, to make this method practical, this paper proposes an automatic and class-dependent similarity threshold setting mechanism based on the maximal margin criterion, and uses a Self-Organizing Map-based embedding technique to alleviate the computational problem. Experimental results show the feasibility and effectiveness of the proposed method.
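To make the idea of a non-metric partial similarity concrete, the following is a minimal illustrative sketch (not the paper's learned method): two images are compared patch by patch, and only the best-matching fraction of patches contributes to the score, so a localized occlusion is largely ignored. The function name, the fixed `keep_ratio` parameter, and the patch representation are assumptions for illustration; in the paper the comparable threshold is set automatically and per class via the maximal margin criterion.

```python
import numpy as np

def partial_similarity(x, y, keep_ratio=0.7):
    """Illustrative non-metric partial similarity.

    x, y: arrays of shape (n_patches, patch_dim) holding corresponding
    local features of two images. Only the keep_ratio fraction of
    best-matching patches is scored; the most dissimilar patches
    (e.g. an occluded region) are discarded. Hypothetical sketch only.
    """
    # squared distance between corresponding patches
    d = np.sum((np.asarray(x) - np.asarray(y)) ** 2, axis=1)
    k = max(1, int(keep_ratio * len(d)))
    kept = np.sort(d)[:k]          # drop the largest, most confusing distances
    return -float(np.mean(kept))   # higher value = more similar

# One heavily "occluded" patch barely affects the partial score,
# because discarding the worst matches makes the measure non-metric
# (it can violate the triangle inequality).
a = np.zeros((10, 4))
b = a.copy()
b[9] = 100.0
print(partial_similarity(a, b, keep_ratio=0.9))
```

Because the worst-matching patches are dropped, the measure is deliberately non-metric: two images can each be highly similar to a third while being dissimilar to each other, which is exactly the behavior a metric distance forbids.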

Cite

Text

Tan et al. "Learning Non-Metric Partial Similarity Based on Maximal Margin Criterion." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006. doi:10.1109/CVPR.2006.170

Markdown

[Tan et al. "Learning Non-Metric Partial Similarity Based on Maximal Margin Criterion." IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2006.](https://mlanthology.org/cvpr/2006/tan2006cvpr-learning/) doi:10.1109/CVPR.2006.170

BibTeX

@inproceedings{tan2006cvpr-learning,
  title     = {{Learning Non-Metric Partial Similarity Based on Maximal Margin Criterion}},
  author    = {Tan, Xiaoyang and Chen, Songcan and Li, Jun and Zhou, Zhi-Hua},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2006},
  pages     = {138--145},
  doi       = {10.1109/CVPR.2006.170},
  url       = {https://mlanthology.org/cvpr/2006/tan2006cvpr-learning/}
}