Meta-Learning with Network Pruning
Abstract
Meta-learning is a powerful paradigm for few-shot learning. Despite remarkable success in many applications, existing optimization-based meta-learning models with over-parameterized neural networks have been shown to overfit on training tasks. To remedy this deficiency, we propose a network-pruning-based meta-learning approach that reduces overfitting by explicitly controlling network capacity. A uniform concentration analysis reveals the benefit of the capacity constraint for reducing the generalization gap of the proposed meta-learner. We implement our approach on top of Reptile combined with two network pruning routines: Dense-Sparse-Dense (DSD) and Iterative Hard Thresholding (IHT). Extensive experimental results on benchmark datasets with different over-parameterized deep networks demonstrate that our method not only effectively alleviates meta-overfitting but also, in many cases, improves overall generalization performance on few-shot classification tasks.
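The core recipe the abstract describes, alternating Reptile meta-updates with a hard-thresholding step that zeroes all but the largest-magnitude weights, can be illustrated compactly. Below is a minimal sketch in PyTorch, not the authors' implementation: the toy model, the `hard_threshold` and `reptile_step` helpers, the sparsity level, the pruning schedule, and the synthetic task data are all hypothetical stand-ins for illustration.

```python
# Minimal sketch: IHT-style magnitude pruning inside a Reptile-like outer loop.
# Illustrative reconstruction only; hyperparameters and helpers are hypothetical.
import copy
import torch
import torch.nn as nn

def hard_threshold(model: nn.Module, sparsity: float) -> None:
    """Zero out all but the largest-magnitude parameters (global IHT step).

    Note: for simplicity this thresholds biases along with weights; a real
    implementation would typically prune weight matrices only.
    """
    magnitudes = torch.cat([p.detach().abs().flatten() for p in model.parameters()])
    k = int((1.0 - sparsity) * magnitudes.numel())   # number of entries to keep
    threshold = magnitudes.topk(k).values.min()      # smallest surviving magnitude
    with torch.no_grad():
        for p in model.parameters():
            p.mul_((p.abs() >= threshold).float())   # hard-threshold in place

def reptile_step(model, task_batch, inner_steps=5, inner_lr=1e-2, meta_lr=0.1):
    """One Reptile meta-update: adapt a clone on a task, then interpolate."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    loss_fn = nn.CrossEntropyLoss()
    x, y = task_batch
    for _ in range(inner_steps):
        opt.zero_grad()
        loss_fn(adapted(x), y).backward()
        opt.step()
    with torch.no_grad():                            # move meta-weights toward adapted weights
        for p, q in zip(model.parameters(), adapted.parameters()):
            p.add_(meta_lr * (q - p))

# Usage: interleave meta-updates with periodic hard thresholding, which is one
# way to realize the capacity control the abstract argues for.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 5))
for it in range(1000):
    x = torch.randn(25, 64)                          # toy 5-way, 5-shot task
    y = torch.randint(0, 5, (25,))
    reptile_step(model, (x, y))
    if it % 50 == 0:
        hard_threshold(model, sparsity=0.5)          # keep 50% of parameters
```

Between thresholding steps the dense gradient updates repopulate the zeroed entries, so the schedule alternates dense and sparse phases in the spirit of the DSD and IHT routines named in the abstract.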
Cite
Text
Tian et al. "Meta-Learning with Network Pruning." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58529-7_40
Markdown
[Tian et al. "Meta-Learning with Network Pruning." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/tian2020eccv-metalearning/) doi:10.1007/978-3-030-58529-7_40
BibTeX
@inproceedings{tian2020eccv-metalearning,
  title     = {{Meta-Learning with Network Pruning}},
  author    = {Tian, Hongduan and Liu, Bo and Yuan, Xiao-Tong and Liu, Qingshan},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2020},
  doi       = {10.1007/978-3-030-58529-7_40},
  url       = {https://mlanthology.org/eccv/2020/tian2020eccv-metalearning/}
}