Deep Growing Learning

Abstract

Semi-supervised learning (SSL) is an important paradigm for making full use of a large amount of unlabeled data in machine learning. A bottleneck of SSL is the overfitting problem when training on the limited labeled data, especially with a complex model such as a deep neural network. To get around this bottleneck, we propose a bio-inspired SSL framework for deep neural networks, namely Deep Growing Learning (DGL). Specifically, we formulate SSL as an EM-like process, in which the deep network alternately iterates between automatically growing convolutional layers and selecting reliable pseudo-labeled data for training. DGL guarantees that a shallow neural network is trained with labeled data, while a deeper neural network is trained with a growing amount of reliable pseudo-labeled data, so as to alleviate the overfitting problem. Experiments on different visual recognition tasks have verified the effectiveness of DGL.
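The EM-like alternation described in the abstract can be sketched in miniature. In the sketch below, the deep network is replaced by a toy nearest-class-mean classifier on 1-D points so that the control flow (train, pseudo-label the confident samples, grow, repeat) stays visible; the confidence measure, threshold, growth rule, and all function names are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of an EM-like growing-and-pseudo-labeling loop, in the
# spirit of the DGL abstract. All names and rules here are assumptions;
# the paper grows convolutional layers of a deep network, whereas this
# toy keeps only a depth counter and a nearest-class-mean "model".

def class_means(samples):
    # "Train": compute per-class means from the current (pseudo-)labeled set.
    sums, counts = {}, {}
    for x, y in samples:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(means, x):
    # Return (label, confidence): confidence is the margin between the two
    # nearest class means, squashed into [0, 1).
    ranked = sorted(means, key=lambda y: abs(x - means[y]))
    best, second = ranked[0], ranked[1]
    margin = abs(x - means[second]) - abs(x - means[best])
    return best, margin / (1.0 + margin)

def deep_growing_learning(labeled, unlabeled, conf_threshold=0.5, max_rounds=5):
    depth = 1                      # start from a shallow model
    train_set = list(labeled)      # only ground-truth labels at first
    pool = list(unlabeled)
    for _ in range(max_rounds):
        means = class_means(train_set)        # M-step: (re)train the model
        newly_labeled, rest = [], []
        for x in pool:                        # E-step: select reliable samples
            y, conf = predict(means, x)
            (newly_labeled if conf >= conf_threshold else rest).append((x, y))
        if not newly_labeled:
            break                             # no reliable pseudo-labels left
        train_set += newly_labeled            # enlarge the training set
        pool = [x for x, _ in rest]
        depth += 1                            # grow the model one step deeper
    return depth, train_set
```

With two labeled anchors `[(0.0, 0), (1.0, 1)]` and unlabeled points `[-0.5, 2.0, 0.5]`, the outlying points are pseudo-labeled in the first round while the ambiguous midpoint `0.5` never clears the confidence threshold, so the loop stops after one growth step.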

Cite

Text

Wang et al. "Deep Growing Learning." International Conference on Computer Vision, 2017. doi:10.1109/ICCV.2017.306

Markdown

[Wang et al. "Deep Growing Learning." International Conference on Computer Vision, 2017.](https://mlanthology.org/iccv/2017/wang2017iccv-deep-b/) doi:10.1109/ICCV.2017.306

BibTeX

@inproceedings{wang2017iccv-deep-b,
  title     = {{Deep Growing Learning}},
  author    = {Wang, Guangcong and Xie, Xiaohua and Lai, Jianhuang and Zhuo, Jiaxuan},
  booktitle = {International Conference on Computer Vision},
  year      = {2017},
  doi       = {10.1109/ICCV.2017.306},
  url       = {https://mlanthology.org/iccv/2017/wang2017iccv-deep-b/}
}