Embedded Unsupervised Feature Selection

Abstract

Sparse learning has proven to be a powerful technique in supervised feature selection, allowing feature selection to be embedded into the classification (or regression) problem. In recent years, increasing attention has been paid to applying sparse learning in unsupervised feature selection. Due to the lack of label information, the vast majority of these algorithms usually generate cluster labels via clustering algorithms and then formulate unsupervised feature selection as sparse-learning-based supervised feature selection with these generated cluster labels. In this paper, we propose a novel unsupervised feature selection algorithm, EUFS, which directly embeds feature selection into a clustering algorithm via sparse learning without this transformation. The Alternating Direction Method of Multipliers is used to address the optimization problem of EUFS. Experimental results on various benchmark datasets demonstrate the effectiveness of the proposed framework EUFS.
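
The sketch below illustrates the general idea described in the abstract: embedding feature selection directly into a clustering-style factorization with a row-sparsity (l2,1) penalty, so that features are ranked without first generating cluster labels. It is not the paper's exact EUFS objective or its ADMM solver; the objective, the alternating proximal-gradient updates, and all parameter names here are simplifying assumptions for illustration.

```python
import numpy as np

def euf_selection_sketch(X, n_clusters, alpha=1.0, n_iter=50, seed=0):
    """Illustrative sketch of sparse-learning-based embedded unsupervised
    feature selection (an assumption-laden simplification, NOT the exact
    EUFS formulation or its ADMM optimization).

    X (n samples x d features) is approximated by U @ V.T, where
    U (n x k) acts as soft cluster indicators and V (d x k) carries
    feature weights; an l2,1 penalty on V drives whole rows (features)
    toward zero, embedding selection in the clustering factorization.
    Solved here by simple alternating minimization with a proximal
    gradient step on V.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = n_clusters
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((d, k))

    for _ in range(n_iter):
        # U-step: least-squares fit of X ~ U V^T for fixed V
        # (solve V @ U^T = X^T in the least-squares sense).
        U = np.linalg.lstsq(V, X.T, rcond=None)[0].T

        # V-step: one proximal gradient step on
        # ||X - U V^T||_F^2 + alpha * ||V||_{2,1}.
        G = U.T @ U
        step = 1.0 / (2.0 * np.linalg.norm(G, 2) + 1e-12)
        grad = 2.0 * (V @ G - X.T @ U)
        V = V - step * grad
        # Row-wise soft-thresholding: proximal operator of the l2,1 norm.
        row_norms = np.linalg.norm(V, axis=1, keepdims=True)
        shrink = np.maximum(0.0, 1.0 - step * alpha / (row_norms + 1e-12))
        V = shrink * V

    # Rank features by the l2 norm of their row in V (larger = more relevant).
    scores = np.linalg.norm(V, axis=1)
    return np.argsort(-scores), scores

if __name__ == "__main__":
    X = np.random.default_rng(1).standard_normal((100, 20))
    ranked, scores = euf_selection_sketch(X, n_clusters=3, alpha=0.5)
    print("Top 5 features:", ranked[:5])
```

The row-sparse V is what makes the selection "embedded": the same optimization that fits the cluster structure also zeroes out irrelevant features, in contrast to the two-stage methods the abstract describes, which first produce cluster labels and then run supervised sparse feature selection on them.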

Cite

Text

Wang et al. "Embedded Unsupervised Feature Selection." AAAI Conference on Artificial Intelligence, 2015. doi:10.1609/AAAI.V29I1.9211

Markdown

[Wang et al. "Embedded Unsupervised Feature Selection." AAAI Conference on Artificial Intelligence, 2015.](https://mlanthology.org/aaai/2015/wang2015aaai-embedded/) doi:10.1609/AAAI.V29I1.9211

BibTeX

@inproceedings{wang2015aaai-embedded,
  title     = {{Embedded Unsupervised Feature Selection}},
  author    = {Wang, Suhang and Tang, Jiliang and Liu, Huan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {470--476},
  doi       = {10.1609/AAAI.V29I1.9211},
  url       = {https://mlanthology.org/aaai/2015/wang2015aaai-embedded/}
}