Energy-Efficient Random Number Generation Using Stochastic Magnetic Tunnel Junctions
Abstract
(Pseudo)random sampling is a costly yet widely used method in machine learning. We introduce an energy-efficient algorithm for uniform Float16 sampling, utilizing a room-temperature stochastic magnetic tunnel junction device to generate truly random floating-point numbers. By avoiding expensive symbolic computation and mapping physical phenomena directly onto the statistical properties of the floating-point format and the uniform distribution, our approach achieves energy efficiency at least 9721 times higher than the state-of-the-art Mersenne Twister algorithm and 5649 times higher than the more energy-efficient PCG algorithm. We provide measurements of the potential accumulated approximation errors, demonstrating the effectiveness of our method.
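The abstract does not spell out the bit-to-float mapping, but the general idea of sampling a uniform Float16 directly in its IEEE 754 layout can be sketched as follows. This is a minimal illustration only, assuming a hypothetical mtj_bit() readout standing in for the stochastic magnetic tunnel junction device; the paper's actual circuit, bit budget, and tail handling may differ.

```python
import struct
import random  # software stand-in for the true-random device (assumption)


def mtj_bit() -> int:
    """Hypothetical readout of one random bit from a stochastic magnetic
    tunnel junction. Here a pseudo-random bit is substituted so the sketch
    runs without hardware."""
    return random.getrandbits(1)


def uniform_float16() -> float:
    """Sample a uniform Float16 in [0, 1) by mapping random bits directly
    onto the half-precision layout: a geometrically distributed exponent
    (count of leading zero bits) plus 10 uniform mantissa bits."""
    # Each leading zero halves the binade, matching the probability mass a
    # uniform distribution assigns to the interval [2^-(k+1), 2^-k).
    k = 0
    while mtj_bit() == 0:
        k += 1
        if k >= 14:            # below the smallest normal binade of Float16
            return 0.0         # collapse the negligible tail to zero (sketch choice)
    exponent = 14 - k          # biased exponent for the binade [2^-(k+1), 2^-k)
    mantissa = 0
    for _ in range(10):        # 10 fraction bits, each uniform
        mantissa = (mantissa << 1) | mtj_bit()
    bits = (exponent << 10) | mantissa   # sign bit is 0
    return struct.unpack('<e', struct.pack('<H', bits))[0]


if __name__ == "__main__":
    print([uniform_float16() for _ in range(5)])
```

Because the exponent and mantissa bits are consumed directly from the physical bit stream, no arithmetic transformation of an integer state is needed, which is where the energy advantage over Mersenne Twister and PCG comes from.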
Cite
Text
Alder et al. "Energy-Efficient Random Number Generation Using Stochastic Magnetic Tunnel Junctions." NeurIPS 2024 Workshops: MLNCP, 2024.
Markdown
[Alder et al. "Energy-Efficient Random Number Generation Using Stochastic Magnetic Tunnel Junctions." NeurIPS 2024 Workshops: MLNCP, 2024.](https://mlanthology.org/neuripsw/2024/alder2024neuripsw-energyefficient/)
BibTeX
@inproceedings{alder2024neuripsw-energyefficient,
title = {{Energy-Efficient Random Number Generation Using Stochastic Magnetic Tunnel Junctions}},
author = {Alder, Nicolas and Kajale, Shivam Nitin and Tunsiricharoengul, Milin and Sarkar, Deblina and Herbrich, Ralf},
booktitle = {NeurIPS 2024 Workshops: MLNCP},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/alder2024neuripsw-energyefficient/}
}