Decoding-Based Regression

Abstract

Language models have recently been shown to be capable of performing regression, wherein numeric predictions are represented as decoded strings. In this work, we provide theoretical grounds for this capability and further investigate the utility of causal sequence decoding models as numeric regression heads given any feature representation. We find that, despite being trained in the usual way (for next-token prediction via cross-entropy loss), decoder-based heads are as performant as standard pointwise heads when benchmarked over standard regression tasks, while being flexible enough to capture smooth numeric distributions, such as in the task of density estimation.
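To make the core idea concrete, here is a minimal sketch of representing a number as a sequence of digit tokens and decoding it back, the kind of string-based numeric representation the abstract describes. The fixed-point, base-10 scheme below is an illustrative assumption, not necessarily the paper's exact tokenization.

```python
def encode(y: float, digits: int = 4) -> list[int]:
    """Encode a float in [0, 1) as a fixed-length sequence of base-10
    digit tokens (an assumed tokenization, for illustration only)."""
    scaled = int(round(y * 10**digits))
    return [(scaled // 10 ** (digits - 1 - i)) % 10 for i in range(digits)]

def decode(tokens: list[int]) -> float:
    """Decode a digit-token sequence back to a float; each token
    contributes one decimal place."""
    return sum(d * 10 ** -(i + 1) for i, d in enumerate(tokens))
```

A decoder head would emit a softmax distribution over the digit vocabulary at each position and be trained with standard cross-entropy against these token targets; sampling (rather than greedy) decoding then yields a full predictive distribution over numeric outputs.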

Cite

Text

Song and Bahri. "Decoding-Based Regression." Transactions on Machine Learning Research, 2025.

Markdown

[Song and Bahri. "Decoding-Based Regression." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/song2025tmlr-decodingbased/)

BibTeX

@article{song2025tmlr-decodingbased,
  title     = {{Decoding-Based Regression}},
  author    = {Song, Xingyou and Bahri, Dara},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/song2025tmlr-decodingbased/}
}