Two Dimensional Large Margin Nearest Neighbor for Matrix Classification
Abstract
Matrices are a common form of data encountered in a wide range of real applications. How to classify such data is an important research topic. In this paper, we propose a novel distance metric learning method, named two-dimensional large margin nearest neighbor (2DLMNN), for improving the performance of the k-nearest neighbor (KNN) classifier in matrix classification. In the proposed method, left and right projection matrices are employed to define a matrix-based Mahalanobis distance, which is used to construct an objective that separates points in different classes by a large margin. The parameters in these two projection matrices are far fewer than those in the vector-based counterpart, so our method reduces the risk of overfitting. We also introduce a framework for solving the proposed 2DLMNN, and analyze its convergence behavior, initialization, and parameter determination. Compared with vector-based methods, 2DLMNN performs better for matrix data classification. Promising experimental results on several data sets demonstrate the effectiveness of our method.
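The abstract's matrix-based distance can be sketched as the squared Frobenius norm of the difference of two matrix samples after left and right projection, i.e. ||L (X1 − X2) R||²_F. The sketch below is illustrative only: the projection sizes and random matrices `L` and `R` are placeholder assumptions, not the learned projections from the paper.

```python
import numpy as np

# Hypothetical illustration of a matrix-based Mahalanobis distance as used
# in 2DLMNN. L and R are placeholders; in the actual method they are learned
# so that same-class neighbors are pulled together by a large margin.
rng = np.random.default_rng(0)
d1, d2 = 8, 6   # size of each matrix sample (assumed)
r1, r2 = 3, 2   # projected dimensions (assumed)

L = rng.standard_normal((r1, d1))  # left projection matrix
R = rng.standard_normal((d2, r2))  # right projection matrix

def matrix_distance(X1, X2, L, R):
    """Squared distance ||L (X1 - X2) R||_F^2 between two matrix samples."""
    D = L @ (X1 - X2) @ R
    return float(np.sum(D ** 2))

X1 = rng.standard_normal((d1, d2))
X2 = rng.standard_normal((d1, d2))
print(matrix_distance(X1, X2, L, R))          # non-negative
print(matrix_distance(X1, X1, L, R))          # → 0.0 for identical samples
```

Note the parameter-count argument from the abstract: the two projections hold r1·d1 + d2·r2 entries, whereas a vector-based metric over the flattened sample would need a (d1·d2) × (d1·d2) matrix.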
Cite
Text
Song et al. "Two Dimensional Large Margin Nearest Neighbor for Matrix Classification." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/383
Markdown
[Song et al. "Two Dimensional Large Margin Nearest Neighbor for Matrix Classification." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/song2017ijcai-two/) doi:10.24963/IJCAI.2017/383
BibTeX
@inproceedings{song2017ijcai-two,
title = {{Two Dimensional Large Margin Nearest Neighbor for Matrix Classification}},
author = {Song, Kun and Nie, Feiping and Han, Junwei},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2017},
pages = {2751-2757},
doi = {10.24963/IJCAI.2017/383},
url = {https://mlanthology.org/ijcai/2017/song2017ijcai-two/}
}