Channel Selection Using Gumbel SoftMax
Abstract
Important applications such as mobile computing require reducing the computational costs of neural network inference. Ideally, applications would specify their preferred tradeoff between accuracy and speed, and the network would optimize this end-to-end, using classification error to remove parts of the network. Increasing speed can be done either during training – e.g., pruning filters – or during inference – e.g., conditionally executing a subset of the layers. We propose a single end-to-end framework that can improve inference efficiency in both settings. We use a combination of batch activation loss and classification loss, and Gumbel reparameterization to learn network structure. We train end-to-end, and the same technique supports pruning as well as conditional computation. We obtain promising experimental results for ImageNet classification with ResNet (45-52% less computation).
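The core mechanism the abstract refers to is the Gumbel-Softmax (Gumbel reparameterization) trick: sampling a discrete keep/drop decision per channel in a way that remains differentiable, so the selection can be trained end-to-end with the classification loss. Below is a minimal NumPy sketch of that relaxation; it is an illustration of the general technique, not the authors' implementation, and all names (`gumbel_softmax`, `keep`, the two-way keep/drop logits) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=1.0):
    """Differentiable approximate sample from a categorical distribution.

    Adds Gumbel(0, 1) noise to the logits and applies a temperature-scaled
    softmax; as tau -> 0 the output approaches a one-hot sample.
    """
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = y - y.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

# Each channel gets logits over two choices: [keep, drop].
num_channels = 8
logits = rng.normal(size=(num_channels, 2))
probs = gumbel_softmax(logits, tau=0.5)   # rows sum to 1
keep = probs[:, 0]                        # soft per-channel "keep" weight

# Gating: scale each channel's activations by its keep weight, so channels
# the selector learns to drop are suppressed (and can be pruned after training).
features = rng.normal(size=(num_channels, 4))
gated = features * keep[:, None]
```

In training, a sparsity-encouraging term (the paper's batch activation loss) would push many `keep` weights toward zero while the classification loss preserves accuracy; at low temperature the soft gates approach hard binary channel selection.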
Cite
Text
Herrmann et al. "Channel Selection Using Gumbel SoftMax." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58583-9_15
Markdown
[Herrmann et al. "Channel Selection Using Gumbel SoftMax." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/herrmann2020eccv-channel/) doi:10.1007/978-3-030-58583-9_15
BibTeX
@inproceedings{herrmann2020eccv-channel,
title = {{Channel Selection Using Gumbel SoftMax}},
author = {Herrmann, Charles and Bowen, Richard Strong and Zabih, Ramin},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2020},
doi = {10.1007/978-3-030-58583-9_15},
url = {https://mlanthology.org/eccv/2020/herrmann2020eccv-channel/}
}