Self-Paced Boost Learning for Classification

Abstract

Effectiveness and robustness are two essential aspects of supervised learning. For effective learning, ensemble methods build a strong model from an ensemble of weak models. For robust learning, self-paced learning (SPL) learns at a self-controlled pace from easy samples to complex ones. Motivated by simultaneously enhancing learning effectiveness and robustness, we propose a unified framework, Self-Paced Boost Learning (SPBL). With an adaptive from-easy-to-hard pace in the boosting process, SPBL asymptotically guides the model to focus more on the insufficiently learned samples with higher reliability. Via a max-margin boosting optimization with self-paced sample selection, SPBL is capable of capturing the intrinsic inter-class discriminative patterns while ensuring the reliability of the samples involved in learning. We formulate SPBL as a fully-corrective optimization for classification. Experiments on several real-world datasets show the superiority of SPBL in terms of both effectiveness and robustness.
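The abstract's core idea, combining boosting with self-paced sample selection, can be illustrated with a toy sketch. The code below is not the authors' SPBL algorithm (which uses a max-margin, fully-corrective formulation); it is a minimal assumption-laden illustration that pairs AdaBoost-style decision stumps with the classic hard SPL weighting (keep a sample only if its current loss is below an age parameter `lam`, then gradually relax `lam` to admit harder samples). All names (`fit_stump`, `lam`, the pace factor 1.3) are hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two Gaussian blobs plus a few flipped labels
# standing in for noisy, "hard" samples.
n = 100
X = np.vstack([rng.normal(-1.0, 0.6, (n, 2)), rng.normal(1.0, 0.6, (n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])
y[rng.choice(2 * n, 10, replace=False)] *= -1  # label noise

def fit_stump(X, y, w):
    """Weighted decision stump: best (feature, threshold, sign)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1.0, -1.0):
                pred = s * np.sign(X[:, j] - t)
                pred[pred == 0] = s
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (j, t, s)
    return best, best_err

def stump_predict(stump, X):
    j, t, s = stump
    pred = s * np.sign(X[:, j] - t)
    pred[pred == 0] = s
    return pred

F = np.zeros(2 * n)   # ensemble score
lam = 1.5             # self-paced "age" parameter (assumed initial value)
for _ in range(20):
    loss = np.exp(-y * F)              # per-sample exponential loss
    v = (loss < lam).astype(float)     # hard SPL weights: keep easy samples
    if v.sum() == 0:
        lam *= 1.3
        continue
    w = v * loss
    w /= w.sum()
    stump, err = fit_stump(X, y, w)
    err = max(err, 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)  # AdaBoost weak-learner weight
    F += alpha * stump_predict(stump, X)
    lam *= 1.3  # relax the pace: admit harder samples over rounds

acc = np.mean(np.sign(F) == y)
print(f"training accuracy: {acc:.2f}")
```

Because the flipped-label samples keep a high loss, the SPL mask `v` excludes them in early rounds, so the ensemble concentrates on reliable samples first, which is the robustness intuition the abstract describes.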

Cite

Text

Pi et al. "Self-Paced Boost Learning for Classification." International Joint Conference on Artificial Intelligence, 2016.

Markdown

[Pi et al. "Self-Paced Boost Learning for Classification." International Joint Conference on Artificial Intelligence, 2016.](https://mlanthology.org/ijcai/2016/pi2016ijcai-self/)

BibTeX

@inproceedings{pi2016ijcai-self,
  title     = {{Self-Paced Boost Learning for Classification}},
  author    = {Pi, Te and Li, Xi and Zhang, Zhongfei and Meng, Deyu and Wu, Fei and Xiao, Jun and Zhuang, Yueting},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2016},
  pages     = {1932--1938},
  url       = {https://mlanthology.org/ijcai/2016/pi2016ijcai-self/}
}