HW-GPT-Bench: Hardware-Aware Architecture Benchmark for Language Models

Abstract

The increasing size of language models necessitates a thorough analysis across multiple dimensions to assess trade-offs among crucial hardware metrics such as latency, energy consumption, GPU memory usage, and performance. Identifying optimal model configurations under specific hardware constraints is becoming essential but remains challenging due to the computational load of exhaustive training and evaluation on multiple devices. To address this, we introduce HW-GPT-Bench, a hardware-aware benchmark that utilizes surrogate predictions to approximate various hardware metrics across 13 devices for architectures in the GPT-2 family, with architectures containing up to 1.55B parameters. Our surrogates, via calibrated predictions and reliable uncertainty estimates, faithfully model the heteroscedastic noise inherent in the energy and latency measurements. To estimate perplexity, we employ weight-sharing techniques from Neural Architecture Search (NAS), inheriting pretrained weights from the largest GPT-2 model. Finally, we demonstrate the utility of HW-GPT-Bench by simulating optimization trajectories of various multi-objective optimization algorithms in just a few seconds.
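To give a flavor of the simulation workflow the abstract describes, the sketch below runs a random search over a toy architecture space against a *hypothetical* surrogate and extracts the Pareto front of (perplexity, latency). The surrogate here is a stand-in invented for illustration (its functional form, the `surrogate` name, and the architecture encoding are all assumptions, not the benchmark's actual API); note how the latency noise is heteroscedastic, with standard deviation growing with model size, mirroring the noise model the surrogates are designed to capture.

```python
import random

def surrogate(arch, rng):
    """Hypothetical surrogate: maps an architecture (embedding dim, number of
    layers) to a predicted perplexity and a noisy latency sample whose noise
    scales with model size (heteroscedastic)."""
    embed_dim, n_layers = arch
    params = embed_dim * embed_dim * n_layers   # rough parameter-count proxy
    perplexity = 20.0 + 1e9 / (params + 1e6)    # larger models -> lower ppl
    latency_mean = 1e-7 * params                # larger models -> slower
    latency_std = 0.1 * latency_mean            # noise grows with size
    latency = rng.gauss(latency_mean, latency_std)
    return perplexity, latency

def pareto_front(points):
    """Keep points not dominated in (perplexity, latency), both minimized."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1]
                       for q in points)]

def random_search(n_iters=200, seed=0):
    """Simulate a multi-objective random-search trajectory on the surrogate."""
    rng = random.Random(seed)
    observed = []
    for _ in range(n_iters):
        arch = (rng.choice([256, 512, 768, 1024]), rng.choice([6, 12, 24]))
        ppl, lat = surrogate(arch, rng)
        observed.append((ppl, lat))
    return pareto_front(observed)

front = random_search()
```

Because every evaluation is a cheap surrogate query rather than a training run, an entire optimization trajectory like this completes in well under a second, which is what makes benchmarking many multi-objective algorithms feasible.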

Cite

Text

Sukthanker et al. "HW-GPT-Bench: Hardware-Aware Architecture Benchmark for Language Models." Neural Information Processing Systems, 2024. doi:10.52202/079017-1944

Markdown

[Sukthanker et al. "HW-GPT-Bench: Hardware-Aware Architecture Benchmark for Language Models." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/sukthanker2024neurips-hwgptbench/) doi:10.52202/079017-1944

BibTeX

@inproceedings{sukthanker2024neurips-hwgptbench,
  title     = {{HW-GPT-Bench: Hardware-Aware Architecture Benchmark for Language Models}},
  author    = {Sukthanker, Rhea Sanjay and Zela, Arber and Staffler, Benedikt and Klein, Aaron and Purucker, Lennart and Franke, Jörg K. H. and Hutter, Frank},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-1944},
  url       = {https://mlanthology.org/neurips/2024/sukthanker2024neurips-hwgptbench/}
}