Efficient Dynamic Batch Adaptation (Student Abstract)
Abstract
In this paper we introduce Efficient Dynamic Batch Adaptation (EDBA), which improves on Dynamic Batch Adaptation, a previous method that adjusts the composition and the size of the current batch during training. Our improvements allow Dynamic Batch Adaptation to scale feasibly to larger models and datasets, drastically improving model convergence and generalization. We show that the method still performs especially well in data-scarce scenarios, reaching 90.68% test accuracy when trained on only 100 samples of CIFAR-10, while the baseline reaches only 23.79%. On the full CIFAR-10 dataset, EDBA reaches convergence in ∼120 epochs, whereas the baseline requires ∼300 epochs.
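To make the idea concrete, below is a minimal, generic sketch of a training loop that adapts both batch composition and batch size during training. It is an illustration only, not the EDBA algorithm from the paper: the hardest-sample selection rule, the plateau-based growth rule, and all names are assumptions introduced here for exposition.

```python
# Illustrative sketch only: a generic "dynamic batch" training loop that
# adjusts batch composition (pick the currently hardest samples) and batch
# size (grow it when the loss plateaus). This is NOT the EDBA algorithm;
# the heuristics below are assumptions made for illustration.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.randn(1000, 32)             # synthetic pool of training samples
y = torch.randint(0, 10, (1000,))
model = torch.nn.Linear(32, 10)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

batch_size, prev_loss = 16, float("inf")
for step in range(200):
    with torch.no_grad():
        per_sample = F.cross_entropy(model(X), y, reduction="none")
    # Composition: select the samples with the highest current loss (assumed heuristic).
    idx = torch.topk(per_sample, k=batch_size).indices
    loss = F.cross_entropy(model(X[idx]), y[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Size: grow the batch when progress stalls (assumed plateau rule).
    if loss.item() > 0.99 * prev_loss and batch_size < 256:
        batch_size *= 2
    prev_loss = loss.item()
```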
Cite
Text
Simionescu and Stoica. "Efficient Dynamic Batch Adaptation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.27024
Markdown
[Simionescu and Stoica. "Efficient Dynamic Batch Adaptation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/simionescu2023aaai-efficient/) doi:10.1609/AAAI.V37I13.27024
BibTeX
@inproceedings{simionescu2023aaai-efficient,
title = {{Efficient Dynamic Batch Adaptation (Student Abstract)}},
author = {Simionescu, Cristian and Stoica, George},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {16328-16329},
doi = {10.1609/AAAI.V37I13.27024},
url = {https://mlanthology.org/aaai/2023/simionescu2023aaai-efficient/}
}