'Less than One'-Shot Learning: Learning N Classes from M < N Samples

Abstract

Deep neural networks require large training sets but suffer from high computational cost and long training times. Training on much smaller training sets while maintaining nearly the same accuracy would be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. One-shot learning is an extreme form of few-shot learning where the model must learn a new class from a single example. We propose the 'less than one'-shot learning task where models must learn N new classes given only M < N samples, and we show that this is achievable with the help of soft labels.
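
As a concrete illustration of the role soft labels play, the sketch below separates N = 3 classes using only M = 2 soft-label points in one dimension, via a distance-weighted soft-label nearest-neighbor rule in the spirit of the paper's soft-label kNN generalization. The prototype positions, the soft-label values, the inverse-distance weighting, and the helper name `predict` are assumptions chosen for illustration, not the paper's exact configuration.

```python
import numpy as np

# 'Less than one'-shot sketch: M = 2 soft-label prototypes inducing
# N = 3 decision regions on the real line. Positions, soft labels, and
# the inverse-distance weighting are hypothetical choices for this demo.

prototypes = np.array([-1.0, 1.0])   # M = 2 sample locations
soft_labels = np.array([
    [0.6, 0.4, 0.0],                 # prototype at x = -1 leans toward class 0
    [0.0, 0.4, 0.6],                 # prototype at x = +1 leans toward class 2
])

def predict(x, eps=1e-9):
    """Distance-weighted soft-label kNN with k = M: weight each
    prototype's soft label by the inverse of its distance to x,
    sum the weighted labels, and return the argmax class."""
    weights = 1.0 / (np.abs(prototypes - x) + eps)  # shape (M,)
    scores = weights @ soft_labels                  # shape (N,)
    return int(np.argmax(scores))

# Sweep the line: three contiguous decision regions emerge from two samples.
for x in np.linspace(-2.0, 2.0, 17):
    print(f"x = {x:+.2f} -> class {predict(x)}")
```

Sweeping x from -2 to 2 yields three contiguous regions: class 0 on the left, class 1 around the origin (roughly -1/3 < x < 1/3 with these values), and class 2 on the right. With hard labels, M points can realize at most M classes, so the third region can only arise from the extra information carried by the soft labels.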

Cite

Text

Sucholutsky and Schonlau. "'Less than One'-Shot Learning: Learning N Classes from M < N Samples." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/aaai.v35i11.17171

Markdown

[Sucholutsky and Schonlau. "'Less than One'-Shot Learning: Learning N Classes from M < N Samples." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/sucholutsky2021aaai-less/) doi:10.1609/aaai.v35i11.17171

BibTeX

@inproceedings{sucholutsky2021aaai-less,
  title     = {{'Less than One'-Shot Learning: Learning N Classes from M $<$ N Samples}},
  author    = {Sucholutsky, Ilia and Schonlau, Matthias},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {9739--9746},
  doi       = {10.1609/aaai.v35i11.17171},
  url       = {https://mlanthology.org/aaai/2021/sucholutsky2021aaai-less/}
}