Second Order Cone Programming Formulations for Feature Selection

Abstract

This paper addresses the problem of feature selection for linear classifiers given the moments of the class conditional densities. The problem is posed as finding a minimal set of features such that the resulting classifier has a low misclassification error. Using a bound on the misclassification error involving the means and covariances of the class conditional densities, and minimizing an L1 norm as an approximate criterion for feature selection, a second order cone programming formulation is derived. To handle errors in the estimates of the means and covariances, a tractable robust formulation is also discussed. In a slightly different setting, the Fisher discriminant is derived, and feature selection for the Fisher discriminant is also discussed. Experimental results on synthetic data sets and on real-life microarray data show that the proposed formulations are competitive with the state-of-the-art linear programming formulation.
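The bound mentioned in the abstract is of the multivariate Chebyshev type: a point drawn from any distribution with mean μ and covariance Σ satisfies w·x + b ≥ 0 with probability at least α whenever w·μ + b ≥ κ(α)·sqrt(wᵀΣw), with κ(α) = sqrt(α / (1 − α)). This is what yields a second order cone constraint. A minimal sketch of checking that constraint (the function name and the toy two-feature numbers below are illustrative, not from the paper):

```python
import math

def soc_margin(w, b, mu, sigma, alpha):
    """Slack of the second order cone constraint
        w.mu + b - kappa * sqrt(w' Sigma w),  kappa = sqrt(alpha / (1 - alpha)).
    A non-negative return value means any distribution with moments (mu, Sigma)
    puts probability at least alpha on the halfspace {x : w.x + b >= 0}."""
    kappa = math.sqrt(alpha / (1.0 - alpha))
    n = len(w)
    # w' Sigma w via plain loops, so the sketch needs no external libraries
    quad = sum(w[i] * sigma[i][j] * w[j] for i in range(n) for j in range(n))
    mean_margin = sum(wi * mi for wi, mi in zip(w, mu)) + b
    return mean_margin - kappa * math.sqrt(quad)

# Toy example: alpha = 0.9 gives kappa = 3, so a class with unit covariance
# needs its mean at distance 3 from the hyperplane to satisfy the constraint.
print(soc_margin([1.0, 0.0], 0.0, [3.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], 0.9))
```

In the paper's setting this constraint is imposed for each class (with opposite signs of w·x + b), and the L1 norm of w is minimized over the feasible set to drive most components of w to zero, which is the feature selection step.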

Cite

Text

Bhattacharyya. "Second Order Cone Programming Formulations for Feature Selection." Journal of Machine Learning Research, 2004.

Markdown

[Bhattacharyya. "Second Order Cone Programming Formulations for Feature Selection." Journal of Machine Learning Research, 2004.](https://mlanthology.org/jmlr/2004/bhattacharyya2004jmlr-second/)

BibTeX

@article{bhattacharyya2004jmlr-second,
  title     = {{Second Order Cone Programming Formulations for Feature Selection}},
  author    = {Bhattacharyya, Chiranjib},
  journal   = {Journal of Machine Learning Research},
  year      = {2004},
  pages     = {1417--1433},
  volume    = {5},
  url       = {https://mlanthology.org/jmlr/2004/bhattacharyya2004jmlr-second/}
}