Self-Paced Convolutional Neural Networks

Abstract

Convolutional neural networks (CNNs) have achieved breakthrough performance in many pattern recognition tasks. To distinguish reliable data from noisy and confusing data, we improve CNNs with self-paced learning (SPL) to enhance their learning robustness. In the proposed self-paced convolutional network (SPCN), each sample is assigned a weight that reflects the easiness of the sample. A dynamic self-paced function is then incorporated into the learning objective of the CNN to jointly learn the CNN parameters and the latent weight variables. SPCN learns samples from easy to complex, and the sample weights can dynamically control the learning rates so that the model converges to better values. To gain more insight into SPCN, theoretical studies are conducted showing that SPCN converges to a stationary solution and is robust to noisy and confusing data. Experimental results on the MNIST and rectangles datasets demonstrate that the proposed method outperforms baseline methods.
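To illustrate the self-paced weighting idea the abstract describes, here is a minimal sketch of the classic hard-weight SPL scheme (alternating between solving for sample weights and updating the model). This is an illustrative simplification, not the paper's dynamic self-paced function; the function names and the "age" parameter `lam` are assumptions for exposition.

```python
import numpy as np

def spl_weights(losses, lam):
    """Hard self-paced weighting: with regularizer -lam * sum(v_i),
    the closed-form solution sets v_i = 1 for "easy" samples whose
    loss is below the age parameter lam, and v_i = 0 otherwise."""
    return (np.asarray(losses, dtype=float) < lam).astype(float)

def weighted_loss(losses, weights):
    """Objective seen by the model update step: only the currently
    selected (easy) samples contribute to the training loss."""
    losses = np.asarray(losses, dtype=float)
    return float(np.sum(weights * losses))

# Example: with lam = 0.6, the three low-loss samples are selected
# and the hard sample (loss 2.0) is temporarily excluded.
losses = [0.1, 0.5, 2.0, 0.05]
v = spl_weights(losses, lam=0.6)
print(v)                          # selected-sample indicator vector
print(weighted_loss(losses, v))   # loss over easy samples only
```

In practice, `lam` grows over training iterations so that progressively harder samples are admitted, which is the easy-to-complex curriculum the abstract refers to.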

Cite

Text

Li and Gong. "Self-Paced Convolutional Neural Networks." International Joint Conference on Artificial Intelligence, 2017. doi:10.24963/IJCAI.2017/293

Markdown

[Li and Gong. "Self-Paced Convolutional Neural Networks." International Joint Conference on Artificial Intelligence, 2017.](https://mlanthology.org/ijcai/2017/li2017ijcai-self/) doi:10.24963/IJCAI.2017/293

BibTeX

@inproceedings{li2017ijcai-self,
  title     = {{Self-Paced Convolutional Neural Networks}},
  author    = {Li, Hao and Gong, Maoguo},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {2110--2116},
  doi       = {10.24963/IJCAI.2017/293},
  url       = {https://mlanthology.org/ijcai/2017/li2017ijcai-self/}
}