The Simpler the Better: An Entropy-Based Importance Metric to Reduce Neural Networks' Depth
Abstract
While deep neural networks are highly effective at solving complex tasks, large pre-trained models are commonly employed even to solve considerably simpler downstream tasks, which do not necessarily require a large model's complexity. Motivated by awareness of AI's ever-growing environmental impact, we propose an efficiency strategy that leverages the prior knowledge transferred by large models. We propose a simple yet effective method relying on an Entropy-bASed Importance mEtRic (EASIER) to reduce the depth of over-parametrized deep neural networks, which alleviates their computational burden. We assess the effectiveness of our method on traditional image classification setups. Our code is available at https://github.com/VGCQ/EASIER.
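To make the idea concrete, below is a minimal sketch (not the authors' released implementation; see the repository above for that) of one plausible reading of an entropy-based layer-importance score: the binary entropy of each rectifier's on/off state over a calibration set, where a score near zero means the layer behaves almost linearly and is a candidate for removal. The helper name layer_entropies and the data loader are illustrative assumptions.

import torch
import torch.nn as nn


@torch.no_grad()
def layer_entropies(model: nn.Module, loader, device: str = "cpu") -> dict:
    """Estimate, for every nn.ReLU, the mean binary entropy of its neurons'
    on/off state over a dataset. Assumes each nn.ReLU module is used once
    per forward pass (networks built with functional relu are not seen)."""
    model.eval().to(device)
    firing_sum, sample_count, hooks = {}, 0, []

    def make_hook(name):
        def hook(_module, _inp, out):
            # Count how often each neuron fires, summed over the batch.
            active = (out > 0).float().flatten(start_dim=1).sum(dim=0)
            firing_sum[name] = firing_sum.get(name, 0.0) + active
        return hook

    for name, module in model.named_modules():
        if isinstance(module, nn.ReLU):
            hooks.append(module.register_forward_hook(make_hook(name)))

    for x, _ in loader:
        model(x.to(device))
        sample_count += x.size(0)

    for h in hooks:
        h.remove()

    entropies = {}
    for name, s in firing_sum.items():
        p = (s / sample_count).clamp(1e-8, 1 - 1e-8)    # per-neuron firing rate
        h = -(p * p.log2() + (1 - p) * (1 - p).log2())  # binary entropy, in bits
        entropies[name] = h.mean().item()
    return entropies

In a depth-reduction loop, one would then replace the lowest-scoring rectifier with an identity, fine-tune, and repeat until accuracy falls below a chosen tolerance.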
Cite
Text
Quétu et al. "The Simpler the Better: An Entropy-Based Importance Metric to Reduce Neural Networks' Depth." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024. doi:10.1007/978-3-031-70365-2_6Markdown
[Quétu et al. "The Simpler the Better: An Entropy-Based Importance Metric to Reduce Neural Networks' Depth." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024.](https://mlanthology.org/ecmlpkdd/2024/quetu2024ecmlpkdd-simpler/) doi:10.1007/978-3-031-70365-2_6BibTeX
@inproceedings{quetu2024ecmlpkdd-simpler,
title = {{The Simpler the Better: An Entropy-Based Importance Metric to Reduce Neural Networks' Depth}},
author = {Quétu, Victor and Liao, Zhu and Tartaglione, Enzo},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2024},
pages = {92--108},
doi = {10.1007/978-3-031-70365-2_6},
url = {https://mlanthology.org/ecmlpkdd/2024/quetu2024ecmlpkdd-simpler/}
}