An Analytical Approach to Enhancing DNN Efficiency and Accuracy Using Approximate Multiplication
Abstract
The accuracy of Deep Neural Networks (DNNs) often plateaus despite extensive training, retraining, and fine-tuning. This paper presents an analytical study of approximate multipliers as a means of pushing past that plateau. Leveraging the principles of Information Bottleneck (IB) theory, we analyze the enhanced information and feature extraction capabilities that approximate multipliers provide. Through Information Plane (IP) analysis, we gain a detailed understanding of DNN behavior under this approach. Our analysis indicates that the technique can break through existing accuracy barriers while offering computational and energy efficiency benefits. Compared to traditional, computationally intensive methods, our approach relies on less demanding optimization techniques. Additionally, approximate multipliers reduce energy consumption during both the training and inference phases. Experimental results support the potential of this method, suggesting it is a promising direction for DNN optimization.
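The abstract does not specify which approximate multiplier design is used; a common family is truncation-based multiplication, which discards low-order operand bits to save energy at the cost of a small, bounded error. The sketch below (function name, bit widths, and the `drop_bits` parameter are illustrative assumptions, not the paper's design) shows the basic idea:

```python
def approx_mul(a: int, b: int, drop_bits: int = 3) -> int:
    """Truncation-based approximate multiplier (generic illustration).

    The low `drop_bits` bits of each fixed-point operand are zeroed
    before multiplying, shrinking the partial-product array a hardware
    multiplier must sum -- the usual source of energy savings.
    """
    a_t = (a >> drop_bits) << drop_bits  # zero low bits of a
    b_t = (b >> drop_bits) << drop_bits  # zero low bits of b
    return a_t * b_t

# Exact vs. approximate product for two 8-bit operands
exact = 200 * 117                       # 23400
approx = approx_mul(200, 117)           # 22400, ~4.3% relative error
```

With `drop_bits = 0` the function degenerates to an exact multiply, so the error/energy trade-off can be swept with a single parameter.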
Cite
Shakibhamedan et al. "An Analytical Approach to Enhancing DNN Efficiency and Accuracy Using Approximate Multiplication." ICML 2024 Workshops: WANT, 2024.
@inproceedings{shakibhamedan2024icmlw-analytical,
title = {{An Analytical Approach to Enhancing DNN Efficiency and Accuracy Using Approximate Multiplication}},
author = {Shakibhamedan, Salar and Jahanjoo, Anice and Aminifar, Amin and Amirafshar, Nima and TaheriNejad, Nima and Jantsch, Axel},
booktitle = {ICML 2024 Workshops: WANT},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/shakibhamedan2024icmlw-analytical/}
}