Feature Selection via Joint Embedding Learning and Sparse Regression
Abstract
The problem of feature selection has attracted considerable research interest in the past few years. Traditional learning-based feature selection methods separate embedding learning from feature ranking. In this paper, we introduce a novel unsupervised feature selection approach via Joint Embedding Learning and Sparse Regression (JELSR). Instead of simply employing the graph Laplacian for embedding learning followed by regression, we construct the graph with weights obtained by locally linear approximation and unify embedding learning and sparse regression to perform feature selection. By adding l2,1-norm regularization, we learn a sparse transformation matrix for feature ranking. We also provide an effective algorithm to solve the proposed problem. Compared with traditional unsupervised feature selection methods, our approach integrates the merits of embedding learning and sparse regression simultaneously. Extensive experimental results demonstrate its validity.
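The abstract's core mechanism, ranking features by the row norms of a transformation matrix learned under l2,1-norm regularization, can be illustrated with a minimal sketch. The snippet below solves only the sparse regression subproblem min_W ||XW - Y||_F^2 + beta ||W||_{2,1} for a fixed embedding Y, using iteratively reweighted least squares (a standard technique for l2,1-norm problems, not necessarily the authors' exact alternating scheme); the joint embedding update and the LLE-style graph construction are omitted. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def sparse_regression_l21(X, Y, beta=1.0, n_iter=30, eps=1e-8):
    """Solve min_W ||X W - Y||_F^2 + beta * ||W||_{2,1} by
    iteratively reweighted least squares. X is n-by-d data,
    Y is an n-by-m embedding (fixed here for illustration)."""
    n, d = X.shape
    D = np.ones(d)  # diagonal reweighting terms, one per feature
    for _ in range(n_iter):
        # closed-form update: W = (X^T X + beta * D)^{-1} X^T Y
        A = X.T @ X + beta * np.diag(D)
        W = np.linalg.solve(A, X.T @ Y)
        # reweight: D_ii = 1 / (2 * ||w_i||_2), guarded against zero rows
        row_norms = np.linalg.norm(W, axis=1)
        D = 1.0 / (2.0 * np.maximum(row_norms, eps))
    return W

def rank_features(W):
    """Rank features by the l2 norm of the corresponding row of W:
    large rows mark features important for predicting the embedding."""
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(-scores)

# Toy usage: 100 samples, 10 features; a synthetic 2-D "embedding"
# driven only by features 0 and 1 stands in for the learned one.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
Y = X[:, :2] + 0.01 * rng.standard_normal((100, 2))
W = sparse_regression_l21(X, Y, beta=0.5)
print(rank_features(W)[:2])  # the two informative features rank first
```

The l2,1-norm penalizes whole rows of W, so uninformative features get near-zero rows and fall to the bottom of the ranking, which is what makes the matrix usable for feature selection.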
Cite
Text
Hou et al. "Feature Selection via Joint Embedding Learning and Sparse Regression." International Joint Conference on Artificial Intelligence, 2011. doi:10.5591/978-1-57735-516-8/IJCAI11-224
Markdown
[Hou et al. "Feature Selection via Joint Embedding Learning and Sparse Regression." International Joint Conference on Artificial Intelligence, 2011.](https://mlanthology.org/ijcai/2011/hou2011ijcai-feature/) doi:10.5591/978-1-57735-516-8/IJCAI11-224
BibTeX
@inproceedings{hou2011ijcai-feature,
title = {{Feature Selection via Joint Embedding Learning and Sparse Regression}},
author = {Hou, Chenping and Nie, Feiping and Yi, Dongyun and Wu, Yi},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2011},
pages = {1324-1329},
doi = {10.5591/978-1-57735-516-8/IJCAI11-224},
url = {https://mlanthology.org/ijcai/2011/hou2011ijcai-feature/}
}