Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior

Abstract

Features of tasks and environments are often represented in the brain by neural firing rates. Representations must be decoded to enable downstream actions, and decoding takes time. We describe a toy model with a Poisson process encoder and an ideal observer Bayesian decoder, and show that decoding rate-coded signals reproduces classic patterns of response time and accuracy observed in humans, including the Hick-Hyman Law, the Power Law of Learning, speed-accuracy trade-offs, and lognormally distributed response times. The decoder is equipped with a codebook, a prior distribution over signals, and an entropy stopping threshold. We argue that historical concerns about the applicability of such information-theoretic tools to neural and behavioral data arise from a confusion about the application of discrete-time coding techniques to continuous-time signals.
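
The abstract names the model's three ingredients (a codebook, a prior over signals, and an entropy stopping threshold) clearly enough to sketch the encoder-decoder loop in code. Below is a minimal, illustrative Python sketch, assuming a uniform prior, a randomly drawn rate codebook, 1 ms time bins, and an arbitrary entropy threshold; none of the names or parameter values come from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: N candidate signals, each encoded by a vector of
# firing rates (Hz) across a small Poisson neuron population (the codebook).
n_signals = 8
n_neurons = 16
dt = 0.001                                    # 1 ms time bins
codebook = rng.uniform(5.0, 50.0, size=(n_signals, n_neurons))

prior = np.full(n_signals, 1.0 / n_signals)   # uniform prior over signals
entropy_threshold = 0.1                       # stop below this posterior entropy (bits)

def decode(true_signal, max_steps=50_000):
    """Ideal-observer decoding of a rate-coded Poisson signal.

    Accumulates spike counts bin by bin, updates the posterior with
    Bayes' rule, and stops once posterior entropy drops below the
    threshold. Returns (decoded signal index, decision time in seconds).
    """
    log_post = np.log(prior)
    expected = codebook * dt                  # expected counts per bin, per hypothesis
    for step in range(1, max_steps + 1):
        # Encoder: spikes emitted this bin under the true signal's rates.
        spikes = rng.poisson(codebook[true_signal] * dt)
        # Poisson log-likelihood of the counts under each candidate signal
        # (the log k! term is constant across hypotheses and can be dropped).
        log_post += (spikes * np.log(expected) - expected).sum(axis=1)
        log_post -= log_post.max()            # numerical stability
        post = np.exp(log_post)
        post /= post.sum()
        entropy = -np.sum(post[post > 0] * np.log2(post[post > 0]))
        if entropy < entropy_threshold:
            return int(np.argmax(post)), step * dt
    return int(np.argmax(post)), max_steps * dt

guess, rt = decode(true_signal=3)
print(f"decoded signal {guess}, response time {rt * 1000:.0f} ms")

Re-running decode while varying n_signals should make mean decision time grow roughly linearly in log2 of the number of alternatives, the qualitative signature of the Hick-Hyman Law. This is an illustration of the mechanism the abstract describes, not the authors' implementation.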

Cite

Text

Christie and Schrater. "Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior." NeurIPS 2022 Workshops: InfoCog, 2022.

Markdown

[Christie and Schrater. "Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior." NeurIPS 2022 Workshops: InfoCog, 2022.](https://mlanthology.org/neuripsw/2022/christie2022neuripsw-informationtheoretic/)

BibTeX

@inproceedings{christie2022neuripsw-informationtheoretic,
  title     = {{Information-Theoretic Neural Decoding Reproduces Several Laws of Human Behavior}},
  author    = {Christie, S. Thomas and Schrater, Paul R.},
  booktitle = {NeurIPS 2022 Workshops: InfoCog},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/christie2022neuripsw-informationtheoretic/}
}