Calibrating CNNs for Few-Shot Meta Learning

Abstract

Although few-shot meta learning has been extensively studied in the machine learning community, fast adaptation to new tasks remains a challenge in the few-shot learning scenario. Neuroscience research reveals that the capability of evolving neural network formulations is essential for task adaptation, which has been broadly studied in recent meta-learning research. In this paper, we present a novel forward-backward meta-learning framework (FBM) to facilitate model generalization in few-shot learning from a new perspective, i.e., neuron calibration. In particular, FBM models the neurons in a deep neural network-based model as calibrated units under a general formulation, where neuron calibration endows neural network-based models with fast adaptation capability by influencing both their forward inference path and backward propagation path. The proposed calibration scheme is lightweight and applicable to various feed-forward neural network architectures. Extensive empirical experiments on challenging few-shot learning benchmarks validate that our approach, trained with neuron calibration, achieves promising performance, demonstrating that neuron calibration plays a vital role in improving few-shot learning performance.
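The abstract describes calibrating each neuron so that the calibration parameters affect both the forward inference path and the backward propagation path. A minimal NumPy sketch of this idea is below; the per-neuron scale/shift parameters (`gamma`, `beta`) and the function names are illustrative assumptions, not the authors' actual formulation or code.

```python
import numpy as np

# Hypothetical sketch of neuron calibration: each neuron's pre-activation
# is scaled and shifted by task-specific parameters (gamma, beta), so
# calibration modulates the forward pass and, via the chain rule, also
# re-scales the gradients flowing through the backward pass.

def relu(z):
    return np.maximum(z, 0.0)

def calibrated_forward(x, W, b, gamma, beta):
    """Forward pass of one calibrated layer: h = relu(gamma * (W @ x + b) + beta)."""
    z = W @ x + b              # standard pre-activation
    z_cal = gamma * z + beta   # per-neuron calibration
    return relu(z_cal)

def calibrated_grad_W(x, W, b, gamma, beta, grad_out):
    """Gradient of the loss w.r.t. W; gamma re-scales the backward signal."""
    z = W @ x + b
    z_cal = gamma * z + beta
    relu_mask = (z_cal > 0).astype(float)
    # chain rule: dL/dz = dL/dh * relu'(z_cal) * gamma
    grad_z = grad_out * relu_mask * gamma
    return np.outer(grad_z, x)

rng = np.random.default_rng(0)
x = rng.normal(size=4)
W = rng.normal(size=(3, 4))
b = np.zeros(3)
gamma, beta = np.ones(3), np.zeros(3)  # identity calibration

h = calibrated_forward(x, W, b, gamma, beta)
```

With identity calibration (`gamma = 1`, `beta = 0`) the layer reduces to an ordinary ReLU layer, while a different `gamma` changes both the activations and the weight gradients, which is the sense in which calibration touches both paths.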

Cite

Text

Yang et al. "Calibrating CNNs for Few-Shot Meta Learning." Winter Conference on Applications of Computer Vision, 2022.

Markdown

[Yang et al. "Calibrating CNNs for Few-Shot Meta Learning." Winter Conference on Applications of Computer Vision, 2022.](https://mlanthology.org/wacv/2022/yang2022wacv-calibrating/)

BibTeX

@inproceedings{yang2022wacv-calibrating,
  title     = {{Calibrating CNNs for Few-Shot Meta Learning}},
  author    = {Yang, Peng and Ren, Shaogang and Zhao, Yang and Li, Ping},
  booktitle = {Winter Conference on Applications of Computer Vision},
  year      = {2022},
  pages     = {2090--2099},
  url       = {https://mlanthology.org/wacv/2022/yang2022wacv-calibrating/}
}