Multi-Exit Resource-Efficient Neural Architecture for Image Classification with Optimized Fusion Block

Abstract

In this paper, we propose a test-time resource-efficient neural architecture for image classification. Building on MSDNet [12], our multi-exit architecture excels in two critical scenarios: anytime classification, which progressively refines the prediction for a test example and allows early output, and budgeted batch classification, which flexibly allocates computation across inputs so that a set of examples is classified within a fixed budget. Thanks to a novel feature fusion building block combined with an efficient stem block, our architecture achieves state-of-the-art performance on CIFAR-10 and CIFAR-100 in both scenarios.
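The early-exit behavior underlying anytime classification can be sketched as a confidence-threshold rule: each intermediate classifier emits logits, and inference stops at the first exit whose softmax confidence clears a threshold. This is an illustrative toy, not the authors' implementation; the `early_exit_predict` helper and the 0.9 threshold are assumptions.

```python
import math


def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def early_exit_predict(exit_logits, threshold=0.9):
    """Confidence-threshold early exiting over a multi-exit network.

    exit_logits: one logit vector per exit, ordered shallow -> deep
    (here precomputed; a real model would evaluate exits lazily).
    Returns (predicted_class, exit_index). Stops at the first exit
    whose max softmax probability meets the threshold; the deepest
    exit always answers, so "hard" inputs use the full network.
    """
    last = len(exit_logits) - 1
    for i, logits in enumerate(exit_logits):
        probs = softmax(logits)
        conf = max(probs)
        if conf >= threshold or i == last:
            return probs.index(conf), i


# An "easy" input is confident at the first exit; a "hard" one
# falls through to the final exit.
easy = [[4.0, 0.0, 0.0], [5.0, 0.0, 0.0]]
hard = [[0.5, 0.4, 0.0], [3.0, 0.0, 0.0]]
print(early_exit_predict(easy))                  # exits at index 0
print(early_exit_predict(hard, threshold=0.95))  # falls through to index 1
```

Budgeted batch classification can reuse the same machinery by tuning the threshold per batch so that the expected compute spent across all inputs stays within the budget.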

Cite

Text

Addad et al. "Multi-Exit Resource-Efficient Neural Architecture for Image Classification with Optimized Fusion Block." IEEE/CVF International Conference on Computer Vision Workshops, 2023. doi:10.1109/ICCVW60793.2023.00161

Markdown

[Addad et al. "Multi-Exit Resource-Efficient Neural Architecture for Image Classification with Optimized Fusion Block." IEEE/CVF International Conference on Computer Vision Workshops, 2023.](https://mlanthology.org/iccvw/2023/addad2023iccvw-multiexit/) doi:10.1109/ICCVW60793.2023.00161

BibTeX

@inproceedings{addad2023iccvw-multiexit,
  title     = {{Multi-Exit Resource-Efficient Neural Architecture for Image Classification with Optimized Fusion Block}},
  author    = {Addad, Youva and Lechervy, Alexis and Jurie, Frédéric},
  booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
  year      = {2023},
  pages     = {1478--1483},
  doi       = {10.1109/ICCVW60793.2023.00161},
  url       = {https://mlanthology.org/iccvw/2023/addad2023iccvw-multiexit/}
}