Analog Foundation Models

Abstract

Analog in-memory computing (AIMC) is a promising compute paradigm for improving the speed and power efficiency of neural network inference beyond the limits of conventional von Neumann architectures. However, AIMC introduces fundamental challenges such as noisy computations and strict constraints on input and output quantization. Because of these constraints and imprecisions, off-the-shelf LLMs cannot achieve 4-bit-level performance when deployed on AIMC-based hardware. While prior work has investigated closing this accuracy gap on small, mostly vision-based models, a generic method applicable to LLMs pre-trained on trillions of tokens does not yet exist. In this work, we introduce a general and scalable method to robustly adapt LLMs for execution on noisy, low-precision analog hardware. Our approach enables state-of-the-art models, including Phi-3-mini-4k-instruct and Llama-3.2-1B-Instruct, to retain performance comparable to 4-bit weight, 8-bit activation baselines despite the presence of analog noise and quantization constraints. Additionally, we show that, as a byproduct of our training methodology, analog foundation models can be quantized for inference on low-precision digital hardware. Finally, we show that our models also benefit from test-time compute scaling, exhibiting better scaling behavior than models trained with 4-bit weights and 8-bit static input quantization. Our work bridges the gap between high-capacity LLMs and efficient analog hardware, offering a path toward energy-efficient foundation models. Code is available at [github.com/IBM/analog-foundation-models](https://github.com/IBM/analog-foundation-models).
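For readers unfamiliar with the constraints the abstract names, below is a minimal Python sketch of a single AIMC matrix-vector product: inputs are quantized (DAC), weights are perturbed by analog noise, and outputs are quantized (ADC). The additive Gaussian noise model, the bit widths, and the helper names (`fake_quant`, `aimc_matvec`) are illustrative assumptions for this sketch, not the authors' implementation or the paper's exact hardware model.

```python
import numpy as np

def fake_quant(x, n_bits=8):
    """Uniform symmetric fake quantization to n_bits (illustrative assumption)."""
    levels = 2 ** (n_bits - 1) - 1                 # e.g. 127 for 8 bits
    scale = np.abs(x).max() / levels + 1e-12       # dynamic range per call
    return np.clip(np.round(x / scale), -levels, levels) * scale

def aimc_matvec(W, x, noise_std=0.02, in_bits=8, out_bits=8, rng=None):
    """Simulate one analog-tile matvec: quantize inputs (DAC),
    perturb weights (analog noise), quantize outputs (ADC)."""
    rng = rng or np.random.default_rng()
    x_q = fake_quant(x, in_bits)                                   # input quantization
    W_noisy = W + noise_std * np.abs(W).max() * rng.standard_normal(W.shape)
    return fake_quant(W_noisy @ x_q, out_bits)                     # output quantization

# Compare the noisy, quantized matvec against the ideal digital result.
rng = np.random.default_rng(0)
W = rng.standard_normal((16, 64)) / 8.0
x = rng.standard_normal(64)
err = np.linalg.norm(aimc_matvec(W, x, rng=rng) - W @ x) / np.linalg.norm(W @ x)
print(f"relative error vs. ideal matvec: {err:.3f}")
```

Training a model to tolerate exactly these perturbations (rather than applying them post hoc) is the intuition behind hardware-aware adaptation of the kind the paper proposes.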

Cite

Text

Büchel et al. "Analog Foundation Models." Advances in Neural Information Processing Systems, 2025.

Markdown

[Büchel et al. "Analog Foundation Models." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/buchel2025neurips-analog/)

BibTeX

```bibtex
@inproceedings{buchel2025neurips-analog,
  title     = {{Analog Foundation Models}},
  author    = {Büchel, Julian and Chalas, Iason and Acampa, Giovanni and Chen, An and Fagbohungbe, Omobayode and Tsai, Hsinyu and El Maghraoui, Kaoutar and Le Gallo, Manuel and Rahimi, Abbas and Sebastian, Abu},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/buchel2025neurips-analog/}
}
```