Metric Embedding for Kernel Classification Rules
Abstract
In this paper, we consider a smoothing kernel-based classification rule and propose an algorithm for optimizing the performance of the rule by learning the bandwidth of the smoothing kernel along with a data-dependent distance metric. The data-dependent distance metric is obtained by learning a function that embeds an arbitrary metric space into a Euclidean space while minimizing an upper bound on the resubstitution estimate of the error probability of the kernel classification rule. By restricting this embedding function to a reproducing kernel Hilbert space, we reduce the problem to solving a semidefinite program and show the resulting kernel classification rule to be a variation of the k-nearest neighbor rule. We compare the performance of the kernel rule (using the learned data-dependent distance metric) to state-of-the-art distance metric learning algorithms (designed for k-nearest neighbor classification) on several benchmark datasets. The results show that the proposed rule achieves classification accuracy as good as or better than the other metric learning algorithms.
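To make the setting concrete, the smoothing kernel classification rule assigns a test point to the class whose training points carry the largest total kernel weight, with the weight of each point determined by a bandwidth h and a distance metric. The sketch below is illustrative only and is not the paper's algorithm: it uses a Gaussian smoothing kernel and a linear map `L` as a stand-in for the learned embedding; all names and the choice of kernel are assumptions.

```python
import numpy as np

def gaussian_kernel(u):
    # Smoothing kernel K; the Gaussian is one common choice (an assumption here).
    return np.exp(-0.5 * np.sum(u * u, axis=-1))

def kernel_classify(x, X_train, y_train, h=1.0, L=None):
    """Smoothing-kernel classification rule: predict the class whose
    training points accumulate the largest kernel weight at x.

    `L` is a hypothetical linear embedding (a Mahalanobis-style metric),
    standing in for the learned data-dependent metric; `h` is the
    smoothing-kernel bandwidth.
    """
    # Map both the query and the training set through the embedding.
    Z = X_train if L is None else X_train @ L.T
    z = x if L is None else L @ x
    # Kernel weight of each training point at the query.
    weights = gaussian_kernel((z - Z) / h)
    # Sum weights per class and pick the argmax.
    classes = np.unique(y_train)
    scores = [weights[y_train == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]
```

As h shrinks, only the nearest training points receive non-negligible weight, which is one way to see the connection to the k-nearest neighbor rule noted in the abstract.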
Cite
Text
Sriperumbudur et al. "Metric Embedding for Kernel Classification Rules." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390283
Markdown
[Sriperumbudur et al. "Metric Embedding for Kernel Classification Rules." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/sriperumbudur2008icml-metric/) doi:10.1145/1390156.1390283
BibTeX
@inproceedings{sriperumbudur2008icml-metric,
title = {{Metric Embedding for Kernel Classification Rules}},
author = {Sriperumbudur, Bharath K. and Lang, Omer A. and Lanckriet, Gert R. G.},
booktitle = {International Conference on Machine Learning},
year = {2008},
pages = {1008-1015},
doi = {10.1145/1390156.1390283},
url = {https://mlanthology.org/icml/2008/sriperumbudur2008icml-metric/}
}