Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis

Abstract

Dimensionality reduction is an important pre-processing step in many applications. Linear discriminant analysis (LDA) is a classical statistical approach for supervised dimensionality reduction. It aims to maximize the ratio of the between-class distance to the within-class distance, thereby maximizing class discrimination, and has been used widely in many applications. However, the classical LDA formulation requires the nonsingularity of the scatter matrices involved. For undersampled problems, where the data dimensionality is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space LDA (NLDA) and orthogonal LDA (OLDA), have been proposed to overcome this problem. NLDA aims to maximize the between-class distance in the null space of the within-class scatter matrix, while OLDA computes a set of orthogonal discriminant vectors via the simultaneous diagonalization of the scatter matrices. Both have been applied successfully in various applications.
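As a minimal sketch of the formulations the abstract refers to, in standard scatter-matrix notation (the symbols $S_b$, $S_w$, and the transformation matrix $G$ follow common convention and are not quoted from this page): classical LDA seeks

\[
G^{*} = \arg\max_{G} \operatorname{trace}\!\left( \left( G^{T} S_w G \right)^{-1} G^{T} S_b G \right),
\]

which is undefined when the within-class scatter matrix $S_w$ is singular, as in the undersampled case. NLDA instead restricts $G$ to the null space of $S_w$,

\[
G^{*} = \arg\max_{G \,:\, G^{T} S_w G = 0} \operatorname{trace}\!\left( G^{T} S_b G \right),
\]

while OLDA additionally enforces orthogonal discriminant vectors via the constraint $G^{T} G = I$.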

Cite

Text

Ye and Xiong. "Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis." Journal of Machine Learning Research, 2006.

Markdown

[Ye and Xiong. "Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis." Journal of Machine Learning Research, 2006.](https://mlanthology.org/jmlr/2006/ye2006jmlr-computational/)

BibTeX

@article{ye2006jmlr-computational,
  title     = {{Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis}},
  author    = {Ye, Jieping and Xiong, Tao},
  journal   = {Journal of Machine Learning Research},
  year      = {2006},
  pages     = {1183--1204},
  volume    = {7},
  url       = {https://mlanthology.org/jmlr/2006/ye2006jmlr-computational/}
}