Optimal Dimensionality of Metric Space for Classification
Abstract
In many real-world applications, Euclidean distance in the original space is a poor metric due to the curse of dimensionality. In this paper, we propose a new method, called Discriminant Neighborhood Embedding (DNE), to learn an appropriate metric space for classification given finite training samples. We define a discriminant adjacency matrix tailored to the classification task: neighboring samples in the same class are squeezed together, while those in different classes are separated as far as possible. The optimal dimensionality of the metric space can be estimated by spectral analysis in the proposed method, which is of great significance for high-dimensional patterns. Experiments on various datasets demonstrate the effectiveness of our method.
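The abstract outlines the method at a high level: build a signed neighborhood graph that pulls same-class neighbors together and pushes different-class neighbors apart, then read the useful dimensionality off a spectral decomposition. A minimal NumPy sketch of that idea follows; the exact adjacency weights (±1), the matrix form `X^T (D - F) X`, and the keep-negative-eigenvalues criterion are assumptions for illustration, not details quoted from the paper.

```python
import numpy as np

def dne_fit(X, y, k=3):
    """Sketch of a DNE-style embedding (assumed details, see lead-in).

    X: (n, d) samples as rows; y: (n,) integer class labels.
    Returns a projection matrix P of shape (d, m), where m is chosen
    by the spectrum rather than fixed in advance.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances (diagonal masked out).
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(D2, np.inf)

    # Signed adjacency: +1 for same-class neighbors (squeeze),
    # -1 for different-class neighbors (separate). Assumed weights.
    F = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(D2[i])[:k]:
            F[i, j] = F[j, i] = 1.0 if y[i] == y[j] else -1.0

    # Laplacian-style matrix of the signed graph; may be indefinite,
    # which is what makes the dimensionality estimate possible.
    Dg = np.diag(F.sum(axis=1))
    S = X.T @ (Dg - F) @ X
    S = (S + S.T) / 2  # symmetrize against round-off

    w, V = np.linalg.eigh(S)
    # Assumed criterion: keep directions with negative eigenvalues,
    # i.e. those along which separation dominates squeezing.
    P = V[:, w < 0]
    if P.shape[1] == 0:
        P = V[:, :1]  # fall back to the single smallest eigenvalue
    return P
```

Projecting with `X @ dne_fit(X, y)` then gives the learned metric space; because the dimensionality is taken from the sign pattern of the spectrum, it need not be specified by the user.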
Cite
Text
Zhang et al. "Optimal Dimensionality of Metric Space for Classification." International Conference on Machine Learning, 2007. doi:10.1145/1273496.1273639
Markdown
[Zhang et al. "Optimal Dimensionality of Metric Space for Classification." International Conference on Machine Learning, 2007.](https://mlanthology.org/icml/2007/zhang2007icml-optimal/) doi:10.1145/1273496.1273639
BibTeX
@inproceedings{zhang2007icml-optimal,
title = {{Optimal Dimensionality of Metric Space for Classification}},
author = {Zhang, Wei and Xue, Xiangyang and Sun, Zichen and Guo, Yue-Fei and Lu, Hong},
booktitle = {International Conference on Machine Learning},
year = {2007},
pages = {1135--1142},
doi = {10.1145/1273496.1273639},
url = {https://mlanthology.org/icml/2007/zhang2007icml-optimal/}
}