Null Space Versus Orthogonal Linear Discriminant Analysis
Abstract
Dimensionality reduction is an important pre-processing step for many applications. Linear Discriminant Analysis (LDA) is one of the well-known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the nonsingularity of the scatter matrices involved. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA) and orthogonal LDA (OLDA), have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition, which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data, and the results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.
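The singularity issue the abstract describes is easy to demonstrate numerically. The sketch below (illustrative only, not code from the paper) builds the total scatter matrix for data with far fewer samples than dimensions and checks its rank: centering removes one degree of freedom, so the rank is at most n - 1, well below the dimension d, and the matrix is necessarily singular.

```python
import numpy as np

# Illustrative sketch of the undersampled case (d >> n): the total
# scatter matrix S_t = X_c^T X_c is d x d but has rank at most n - 1,
# so it is singular and classical LDA cannot invert it.
rng = np.random.default_rng(0)
n, d = 10, 100                    # sample size n much smaller than dimension d
X = rng.standard_normal((n, d))
Xc = X - X.mean(axis=0)           # center the data
St = Xc.T @ Xc                    # total scatter matrix, d x d
rank = np.linalg.matrix_rank(St)
print(rank, d)                    # rank is at most n - 1 = 9, far below d = 100
```

NLDA and OLDA are two of the strategies for working around this: NLDA projects onto the null space of the within-class scatter matrix, while OLDA enforces orthogonality of the discriminant vectors; the paper's main result is the equivalence of the two under a mild rank condition.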
Cite
Text
Ye and Xiong. "Null Space Versus Orthogonal Linear Discriminant Analysis." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143979
Markdown
[Ye and Xiong. "Null Space Versus Orthogonal Linear Discriminant Analysis." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/ye2006icml-null/) doi:10.1145/1143844.1143979
BibTeX
@inproceedings{ye2006icml-null,
title = {{Null Space Versus Orthogonal Linear Discriminant Analysis}},
author = {Ye, Jieping and Xiong, Tao},
booktitle = {International Conference on Machine Learning},
year = {2006},
  pages = {1073--1080},
doi = {10.1145/1143844.1143979},
url = {https://mlanthology.org/icml/2006/ye2006icml-null/}
}