Natural Language Systematicity from a Constraint on Excess Entropy
Abstract
Natural language is systematic: utterances are composed of individually meaningful parts, which are typically concatenated. I argue that natural-language-like systematicity arises in codes that are constrained to have low excess entropy, the mutual information between the past and the future of a process. In three examples, I show that codes with natural-language-like systematicity have lower excess entropy than matched alternatives.
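As a hedged illustration of the abstract's claim (not the paper's actual experiments), the sketch below estimates excess entropy for two toy codes over the same meanings: a systematic code, in which each binary meaning feature is expressed by its own symbol and the symbols are concatenated, and a holistic code, an arbitrary bijection that entangles the two symbol positions. Excess entropy is estimated as the mutual information between a window of past symbols and a window of future symbols in a long sampled stream. All names (`systematic`, `holistic`, `P_FEATURE`) and the specific codebooks are hypothetical choices for this sketch.

```python
import random
from collections import Counter
from math import log2

random.seed(0)

# Meanings are pairs of independent binary features. The feature
# distribution is non-uniform so that scrambling the code induces
# correlations between symbol positions.
P_FEATURE = 0.3  # hypothetical choice for this sketch

def sample_meaning():
    return (int(random.random() < P_FEATURE),
            int(random.random() < P_FEATURE))

# Systematic code: each feature is expressed by its own symbol,
# and the two symbols are concatenated.
def systematic(m):
    return m

# Holistic code: a fixed arbitrary bijection from meanings to
# length-2 strings that entangles the two symbol positions.
HOLISTIC_TABLE = {(0, 0): (0, 0), (0, 1): (1, 1),
                  (1, 0): (0, 1), (1, 1): (1, 0)}

def holistic(m):
    return HOLISTIC_TABLE[m]

def sample_stream(code, n_utterances):
    """Concatenate coded utterances for i.i.d. meanings into one symbol stream."""
    stream = []
    for _ in range(n_utterances):
        stream.extend(code(sample_meaning()))
    return stream

def mutual_information(pairs):
    """Plug-in estimate of I(A; B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    joint = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in joint.items())

def excess_entropy_estimate(stream, k):
    """Estimate excess entropy as I(past k symbols; future k symbols),
    a lower bound that approaches the true value as k grows."""
    pairs = [(tuple(stream[i - k:i]), tuple(stream[i:i + k]))
             for i in range(k, len(stream) - k)]
    return mutual_information(pairs)

for name, code in [("systematic", systematic), ("holistic", holistic)]:
    stream = sample_stream(code, 100_000)
    print(name, round(excess_entropy_estimate(stream, k=4), 4))
```

Under these assumptions the systematic stream is i.i.d., so its estimate sits near zero (up to small plug-in bias), while the holistic stream yields a clearly positive estimate. The two codes are matched in meaning distribution, utterance length, and alphabet; they differ only in whether symbols align with meaning features, which is the contrast the abstract describes.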
Cite
Text
Futrell. "Natural Language Systematicity from a Constraint on Excess Entropy." NeurIPS 2023 Workshops: InfoCog, 2023.Markdown
[Futrell. "Natural Language Systematicity from a Constraint on Excess Entropy." NeurIPS 2023 Workshops: InfoCog, 2023.](https://mlanthology.org/neuripsw/2023/futrell2023neuripsw-natural/)BibTeX
@inproceedings{futrell2023neuripsw-natural,
  title = {{Natural Language Systematicity from a Constraint on Excess Entropy}},
  author = {Futrell, Richard},
  booktitle = {NeurIPS 2023 Workshops: InfoCog},
  year = {2023},
  url = {https://mlanthology.org/neuripsw/2023/futrell2023neuripsw-natural/}
}