MLPs Learn In-Context on Regression and Classification Tasks

Abstract

In-context learning (ICL), the remarkable ability to solve a task from only input exemplars, is often assumed to be a unique hallmark of Transformer models. By examining commonly employed synthetic ICL tasks, we demonstrate that multi-layer perceptrons (MLPs) can also learn in-context. Moreover, MLPs, and the closely related MLP-Mixer models, learn in-context comparably with Transformers under the same compute budget in this setting. We further show that MLPs outperform Transformers on a series of classical tasks from psychology designed to test relational reasoning, which are closely related to in-context classification. These results underscore a need for studying in-context learning beyond attention-based architectures, while also challenging prior arguments against MLPs' ability to solve relational tasks. Altogether, our results highlight the unexpected competence of MLPs in a synthetic setting, and support the growing interest in all-MLP alternatives to Transformer architectures. It remains unclear how MLPs perform against Transformers at scale on real-world tasks, and where a performance gap may originate. We encourage further exploration of these architectures in more complex settings to better understand the potential comparative advantage of attention-based schemes.
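For readers unfamiliar with the "commonly employed synthetic ICL tasks" mentioned above, the following is a minimal illustrative sketch of the standard in-context linear regression setup, not the authors' exact protocol. The function name sample_icl_regression_task and the parameters n_context, dim, and noise_std are hypothetical choices for this sketch: each episode draws a fresh latent weight vector, presents the model with (x, y) exemplar pairs, and asks it to predict the target for a held-out query point from the context alone.

import numpy as np

def sample_icl_regression_task(n_context=16, dim=8, noise_std=0.1, rng=None):
    """Sample one synthetic in-context linear regression episode (illustrative sketch).

    Draw a random weight vector w, generate (x, y) exemplar pairs with
    y = w . x + noise, and hold out a final query point whose target the
    model must predict from the context alone (w itself is never shown).
    """
    rng = np.random.default_rng() if rng is None else rng
    w = rng.normal(size=dim)                      # latent task, resampled every episode
    x = rng.normal(size=(n_context + 1, dim))     # context points plus one query
    y = x @ w + noise_std * rng.normal(size=n_context + 1)
    context = np.concatenate([x[:-1], y[:-1, None]], axis=1)  # (x_i, y_i) exemplars
    query_x, query_y = x[-1], y[-1]               # model sees query_x, must predict query_y
    return context, query_x, query_y

if __name__ == "__main__":
    ctx, qx, qy = sample_icl_regression_task()
    print(ctx.shape, qx.shape, qy)                # (16, 9) (8,) <scalar target>

In this kind of setup, whichever architecture is being tested (Transformer, MLP, or MLP-Mixer) receives the flattened context together with the query input and is trained to output the query target; because the latent task changes every episode, the model can only succeed by inferring the task from the in-context exemplars.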

Cite

Text

Tong and Pehlevan. "MLPs Learn In-Context on Regression and Classification Tasks." International Conference on Learning Representations, 2025.

Markdown

[Tong and Pehlevan. "MLPs Learn In-Context on Regression and Classification Tasks." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/tong2025iclr-mlps/)

BibTeX

@inproceedings{tong2025iclr-mlps,
  title     = {{MLPs Learn In-Context on Regression and Classification Tasks}},
  author    = {Tong, William Lingxiao and Pehlevan, Cengiz},
  booktitle = {International Conference on Learning Representations},
  year      = {2025},
  url       = {https://mlanthology.org/iclr/2025/tong2025iclr-mlps/}
}