A Bayesian Framework for Cross-Situational Word-Learning
Abstract
For infants, early word learning is a chicken-and-egg problem. One way to learn a word is to observe that it co-occurs with a particular referent across different situations. Another way is to use the social context of an utterance to infer the intended referent of a word. Here we present a Bayesian model of cross-situational word learning, and an extension of this model that also learns which social cues are relevant to determining reference. We test our model on a small corpus of mother-infant interaction and find it performs better than competing models. Finally, we show that our model accounts for experimental phenomena including mutual exclusivity, fast-mapping, and generalization from social cues.
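As a rough illustration of the cross-situational idea in the abstract, the sketch below scores candidate word-to-object lexicons against a toy corpus, combining a parsimony prior with a simple noise likelihood. Everything in it is an assumption made for illustration: the situations, the noise and prior parameters, and the brute-force search are not the paper's corpus, model, or inference procedure, and the social-cue extension is omitted.

```python
import itertools
import math

# Toy cross-situational corpus: each situation pairs the words of an
# utterance with the objects present. These situations are made up for
# illustration; they are not the paper's mother-infant corpus.
SITUATIONS = [
    ({"look", "doggie"}, {"dog", "ball"}),
    ({"get", "the", "ball"}, {"ball", "cup"}),
    ({"nice", "doggie"}, {"dog"}),
    ({"throw", "ball"}, {"ball", "dog"}),
]

WORDS = sorted(set().union(*(w for w, _ in SITUATIONS)))
OBJECTS = sorted(set().union(*(o for _, o in SITUATIONS)))


def log_prior(lexicon, alpha=2.0):
    """Parsimony prior: each word-object pair in the lexicon pays a fixed cost."""
    return -alpha * len(lexicon)


def log_likelihood(lexicon, noise=0.2):
    """Words mapped by the lexicon are likely only when their referent is
    present; all other words are treated as non-referential noise."""
    total = 0.0
    for uttered, present in SITUATIONS:
        for word in uttered:
            if word in lexicon:
                total += math.log(1 - noise) if lexicon[word] in present else math.log(noise)
            else:
                total += math.log(noise)
    return total


def best_lexicon(max_pairs=3):
    """Exhaustive search over small lexicons (feasible only for toy data)."""
    best, best_score = {}, log_prior({}) + log_likelihood({})
    for k in range(1, max_pairs + 1):
        for word_subset in itertools.combinations(WORDS, k):
            for referents in itertools.product(OBJECTS, repeat=k):
                lexicon = dict(zip(word_subset, referents))
                score = log_prior(lexicon) + log_likelihood(lexicon)
                if score > best_score:
                    best, best_score = lexicon, score
    return best, best_score


if __name__ == "__main__":
    lexicon, score = best_lexicon()
    print("Best lexicon:", lexicon, "log score:", round(score, 2))
```

With these toy settings, mappings supported across several situations (doggie with dog, ball with ball) outscore one-off co-occurrences, which is the core intuition behind cross-situational learning; the model described in the abstract additionally learns which social cues bear on reference, something this sketch does not attempt.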
Cite
Text
Goodman et al. "A Bayesian Framework for Cross-Situational Word-Learning." Neural Information Processing Systems, 2007.

Markdown
[Goodman et al. "A Bayesian Framework for Cross-Situational Word-Learning." Neural Information Processing Systems, 2007.](https://mlanthology.org/neurips/2007/goodman2007neurips-bayesian/)

BibTeX
@inproceedings{goodman2007neurips-bayesian,
  title     = {{A Bayesian Framework for Cross-Situational Word-Learning}},
  author    = {Goodman, Noah and Tenenbaum, Joshua B. and Black, Michael J.},
  booktitle = {Neural Information Processing Systems},
  year      = {2007},
  pages     = {457--464},
  url       = {https://mlanthology.org/neurips/2007/goodman2007neurips-bayesian/}
}