Backpropagation Without Multiplication
Abstract
The backpropagation algorithm has been modified to work without any multiplications and to tolerate computations with low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating-point format with a 1-bit mantissa and 3 bits in the exponent for the states, and a 1-bit mantissa and a 5-bit exponent for the gradients, while the weights are 16-bit fixed-point numbers. In this way, all the computations can be executed with shift and add operations. Large networks with over 100,000 weights were trained and demonstrated the same performance as networks computed with full precision. An estimate of a circuit implementation shows that a large network can be placed on a single chip, reaching more than 1 billion weight updates per second. A speedup is also obtained on any machine where a multiplication is slower than a shift operation.
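The core trick described in the abstract can be sketched as follows: if a state or gradient is quantized to a sign and a power-of-two exponent (a 1-bit mantissa), then multiplying a fixed-point weight by it reduces to a bit shift. This is an illustrative sketch under those assumptions, not the paper's actual implementation; the function names and exponent ranges here are hypothetical.

```python
import math

def quantize_pow2(x, exp_bits=5):
    """Quantize x to sign * 2^e: a 1-bit mantissa (the sign) plus an
    exp_bits-bit exponent, as the paper uses for gradients (exp_bits=5)
    and states (exp_bits=3). Returns (sign, e); (0, 0) for zero."""
    if x == 0.0:
        return 0, 0
    sign = 1 if x > 0 else -1
    e = int(round(math.log2(abs(x))))          # round exponent to nearest power of two
    lo, hi = -(2 ** (exp_bits - 1)), 2 ** (exp_bits - 1) - 1
    e = max(lo, min(hi, e))                    # clamp to the representable range
    return sign, e

def shift_multiply(w, sign, e):
    """Multiply a fixed-point integer weight w by sign * 2^e using only
    shifts -- no hardware multiplier needed."""
    p = w << e if e >= 0 else w >> -e          # shift replaces the multiply
    return sign * p
```

For example, a gradient of 6.0 quantizes to (+1, 3), so multiplying a weight by it becomes a left shift by 3; a gradient of 0.3 quantizes to (+1, -2), a right shift by 2. The quantization introduces at most a factor-of-sqrt(2) relative error per value, which the paper reports does not hurt training accuracy.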
Cite
Text
Simard and Graf. "Backpropagation Without Multiplication." Neural Information Processing Systems, 1993.
Markdown
[Simard and Graf. "Backpropagation Without Multiplication." Neural Information Processing Systems, 1993.](https://mlanthology.org/neurips/1993/simard1993neurips-backpropagation/)
BibTeX
@inproceedings{simard1993neurips-backpropagation,
  title = {{Backpropagation Without Multiplication}},
  author = {Simard, Patrice Y. and Graf, Hans Peter},
  booktitle = {Neural Information Processing Systems},
  year = {1993},
  pages = {232-239},
  url = {https://mlanthology.org/neurips/1993/simard1993neurips-backpropagation/}
}