Sub-Ensembles for Fast Uncertainty Estimation in Neural Networks

Abstract

Fast estimates of model uncertainty are required for many robust robotics applications. Deep Ensembles provide state-of-the-art uncertainty estimates without requiring Bayesian methods, but they are computationally expensive because they rely on large ensembles of full models. In this paper we propose deep sub-ensembles, an approximation to deep ensembles whose core idea is to ensemble only a selection of layers close to the output, not the whole model. This is motivated by the feature hierarchy learned by convolutional networks, which should allow features to be reused across ensemble members. With ResNet-20 on the CIFAR10 dataset, we obtain a 1.5-2.5x speedup over a deep ensemble with a small increase in error and loss, and similarly up to a 5-15x speedup with a VGG-like network on the SVHN dataset. Our results show that this idea enables a trade-off between error and uncertainty quality versus computational performance, as a sub-ensemble effectively works as an approximation of a deep ensemble.
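The following is a minimal PyTorch-style sketch of the structure the abstract describes: a shared trunk computed once per input, with several small output heads that are ensembled. The class and argument names (SubEnsemble, trunk, make_head, num_heads) are illustrative assumptions, not the paper's reference implementation, and the entropy-based uncertainty summary is just one possible way to report the ensemble's uncertainty.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SubEnsemble(nn.Module):
    """Shared trunk with M independent output heads (illustrative sketch).

    Only the layers close to the output are ensembled; the trunk is
    evaluated once per input and its features are reused by all heads,
    which is where the speedup over a full deep ensemble comes from.
    """

    def __init__(self, trunk: nn.Module, make_head, num_heads: int = 5):
        super().__init__()
        self.trunk = trunk
        self.heads = nn.ModuleList([make_head() for _ in range(num_heads)])

    def forward(self, x):
        features = self.trunk(x)  # computed once, shared by all heads
        probs = torch.stack(
            [F.softmax(head(features), dim=-1) for head in self.heads]
        )  # shape: (num_heads, batch, classes)
        mean = probs.mean(dim=0)  # predictive mean over ensemble members
        # One simple uncertainty summary: entropy of the mean prediction
        entropy = -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)
        return mean, entropy


# Toy usage (hypothetical trunk/head sizes, not the paper's architectures):
trunk = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
model = SubEnsemble(trunk, make_head=lambda: nn.Linear(128, 10), num_heads=5)
x = torch.randn(8, 3, 32, 32)
mean_probs, pred_entropy = model(x)
```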

Cite

Text

Valdenegro-Toro. "Sub-Ensembles for Fast Uncertainty Estimation in Neural Networks." IEEE/CVF International Conference on Computer Vision Workshops, 2023. doi:10.1109/ICCVW60793.2023.00445

Markdown

[Valdenegro-Toro. "Sub-Ensembles for Fast Uncertainty Estimation in Neural Networks." IEEE/CVF International Conference on Computer Vision Workshops, 2023.](https://mlanthology.org/iccvw/2023/valdenegrotoro2023iccvw-subensembles/) doi:10.1109/ICCVW60793.2023.00445

BibTeX

@inproceedings{valdenegrotoro2023iccvw-subensembles,
  title     = {{Sub-Ensembles for Fast Uncertainty Estimation in Neural Networks}},
  author    = {Valdenegro-Toro, Matias},
  booktitle = {IEEE/CVF International Conference on Computer Vision Workshops},
  year      = {2023},
  pages     = {4121--4129},
  doi       = {10.1109/ICCVW60793.2023.00445},
  url       = {https://mlanthology.org/iccvw/2023/valdenegrotoro2023iccvw-subensembles/}
}