Regularized Discriminant Analysis, Ridge Regression and Beyond
Abstract
Fisher linear discriminant analysis (FDA) and its kernel extension, kernel discriminant analysis (KDA), are well-known methods that consider dimensionality reduction and classification jointly. While widely deployed in practical problems, there are still unresolved issues surrounding their efficient implementation and their relationship with least mean squares procedures. In this paper we address these issues within the framework of regularized estimation. Our approach leads to a flexible and efficient implementation of FDA as well as KDA. We also uncover a general relationship between regularized discriminant analysis and ridge regression. This relationship yields variations on conventional FDA based on the pseudoinverse and a direct equivalence to an ordinary least squares estimator.
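The equivalence to an ordinary least squares estimator mentioned in the abstract can be illustrated numerically. The following is a minimal sketch (not code from the paper) of the classical two-class case: the Fisher discriminant direction S_W^{-1}(m1 - m2) coincides, up to scale, with the least squares solution when class targets are coded as +n/n1 and -n/n2. The synthetic data and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes in 5 dimensions (illustrative data only).
n1, n2, d = 60, 40, 5
X1 = rng.normal(loc=0.0, size=(n1, d))
X2 = rng.normal(loc=1.0, size=(n2, d))
X = np.vstack([X1, X2])
n = n1 + n2

# Fisher direction: w_fda = S_W^{-1} (m1 - m2), where S_W is the pooled
# within-class scatter matrix.
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_fda = np.linalg.solve(S_W, m1 - m2)

# Least squares direction: regress specially coded targets (+n/n1, -n/n2)
# on the centered data matrix.
y = np.concatenate([np.full(n1, n / n1), np.full(n2, -n / n2)])
Xc = X - X.mean(axis=0)
w_ols, *_ = np.linalg.lstsq(Xc, y, rcond=None)

# The two directions agree up to a scalar factor: cosine similarity ~ 1.
cos = abs(w_fda @ w_ols) / (np.linalg.norm(w_fda) * np.linalg.norm(w_ols))
print(cos)
```

Replacing the plain least squares fit with a ridge penalty corresponds, in the same way, to regularizing the within-class scatter, which is the regularized-estimation view the paper develops in general form.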
Cite
Zhang et al. "Regularized Discriminant Analysis, Ridge Regression and Beyond." Journal of Machine Learning Research, 2010.
@article{zhang2010jmlr-regularized,
title = {{Regularized Discriminant Analysis, Ridge Regression and Beyond}},
author = {Zhang, Zhihua and Dai, Guang and Xu, Congfu and Jordan, Michael I.},
journal = {Journal of Machine Learning Research},
year = {2010},
pages = {2199--2228},
volume = {11},
url = {https://mlanthology.org/jmlr/2010/zhang2010jmlr-regularized/}
}