Non-Negative Semi-Supervised Learning
Abstract
The contributions of this paper are three-fold. First, we present a general formulation that reaps the benefits of both non-negative data factorization and semi-supervised learning; its solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed, along with a theoretical justification of the algorithm's convergence. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is briefly described for handling tensor data of arbitrary order. Extensive experiments, with comparisons against state-of-the-art algorithms for non-negative data factorization and semi-supervised learning, demonstrate the algorithm's properties in sparsity, classification power, and robustness to image occlusions.
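The paper's multiplicative updates couple the factorization with label information from the semi-supervised setting, and the exact rules are not reproduced in this abstract. As background, the classical unsupervised multiplicative updates of Lee and Seung, which such procedures extend, can be sketched as follows (the function name and parameters are illustrative, not the authors' implementation):

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Approximate V (m x n, non-negative) as W @ H with W (m x r), H (r x n),
    minimizing the Frobenius reconstruction error via multiplicative updates.
    Updates multiply by ratios of non-negative terms, so W and H stay non-negative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Update H, then W; eps guards against division by zero.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because the updates never subtract, many entries of W and H are driven toward zero, which is the source of the sparsity property emphasized in the abstract.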
Cite
Text
Wang et al. "Non-Negative Semi-Supervised Learning." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.
Markdown
[Wang et al. "Non-Negative Semi-Supervised Learning." Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, 2009.](https://mlanthology.org/aistats/2009/wang2009aistats-nonnegative/)
BibTeX
@inproceedings{wang2009aistats-nonnegative,
title = {{Non-Negative Semi-Supervised Learning}},
author = {Wang, Changhu and Yan, Shuicheng and Zhang, Lei and Zhang, Hongjiang},
booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
year = {2009},
pages = {575-582},
volume = {5},
url = {https://mlanthology.org/aistats/2009/wang2009aistats-nonnegative/}
}