On Modulating the Gradient for Meta-Learning

Abstract

Inspired by optimization techniques, we propose a novel meta-learning algorithm with gradient modulation to encourage fast adaptation of neural networks in the absence of abundant data. Our method, termed ModGrad, is designed to circumvent the noisy nature of gradients, which is prevalent in low-data regimes. Furthermore, with scalability in mind, we formulate ModGrad via low-rank approximations, which in turn enables us to employ ModGrad to adapt hefty neural networks. We thoroughly assess and contrast ModGrad against a large family of meta-learning techniques and observe that it comfortably outperforms the baselines while enjoying faster convergence.
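
To make the core idea concrete, below is a minimal sketch of modulating a layer's gradient with a rank-1 (low-rank) matrix before an inner-loop adaptation step. The function name, the rank-1 parameterization, and the softplus nonlinearity are illustrative assumptions, not the paper's exact formulation; in ModGrad the modulation would be produced per task by a learned component rather than supplied as raw vectors.

```python
import torch

# Hypothetical illustration of low-rank gradient modulation; names and the
# exact form of the modulation are assumptions, not the paper's implementation.

def modulated_step(weight, grad, u, v, lr=0.01):
    """One adaptation step with a rank-1 gradient modulation.

    weight: (m, n) parameter matrix of the layer being adapted.
    grad:   (m, n) task gradient w.r.t. the weight (noisy in few-shot regimes).
    u, v:   (m,) and (n,) vectors whose outer product forms a rank-1
            modulation matrix (assumed to come from a small per-task network).
    """
    # Rank-1 modulation matrix; softplus keeps entries positive, so the
    # modulation rescales gradient entries rather than flipping their sign
    # (an assumption made for this sketch).
    modulation = torch.nn.functional.softplus(torch.outer(u, v))
    # Elementwise modulation damps noisy gradient entries and amplifies
    # informative ones before the SGD-style update.
    return weight - lr * modulation * grad

# Toy usage on random tensors.
m, n = 64, 32
weight = torch.randn(m, n)
grad = torch.randn(m, n)            # stands in for a noisy few-shot gradient
u, v = torch.randn(m), torch.randn(n)
adapted = modulated_step(weight, grad, u, v)
print(adapted.shape)                # torch.Size([64, 32])
```

The rank-1 factorization is what keeps the scheme scalable: storing and producing u and v costs O(m + n) per layer instead of the O(mn) a full modulation matrix would require.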

Cite

Text

Simon et al. "On Modulating the Gradient for Meta-Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58598-3_33

Markdown

[Simon et al. "On Modulating the Gradient for Meta-Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/simon2020eccv-modulating/) doi:10.1007/978-3-030-58598-3_33

BibTeX

@inproceedings{simon2020eccv-modulating,
  title     = {{On Modulating the Gradient for Meta-Learning}},
  author    = {Simon, Christian and Koniusz, Piotr and Nock, Richard and Harandi, Mehrtash},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2020},
  doi       = {10.1007/978-3-030-58598-3_33},
  url       = {https://mlanthology.org/eccv/2020/simon2020eccv-modulating/}
}