Abstract
We examine working memory use and incrementality using a cognitive model of grammatical encoding. Our model targets that phase of language production by combining an empirically validated cognitive framework, ACT-R, with a linguistic theory, Combinatory Categorial Grammar. Because the model is built with the Switchboard corpus, it can attempt to realize a larger set of sentences than experiment-specific models. With this methodology, different strategies can be compared according to how closely the model's realized sentences match the test sentences, so the model can still be evaluated by its fit to human data without overfitting to individual experiments. The results show that while having more working memory available improves performance, using less working memory during realization is correlated with a closer fit, even after controlling for sentence complexity. Further, sentences realized with a more incremental strategy are also more similar to the corpus sentences as measured by edit distance. As high incrementality is correlated with low working memory usage, this study offers a possible mechanism by which incrementality can be explained.
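The abstract measures similarity between model-realized sentences and corpus sentences by edit distance. As a minimal illustration (not the paper's implementation), a token-level Levenshtein distance, with hypothetical example sentences, can be sketched as:

```python
# Illustrative sketch only: token-level Levenshtein edit distance, one common
# way to score how closely a realized sentence matches a target sentence.
def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions
    turning token sequence a into token sequence b."""
    prev = list(range(len(b) + 1))  # distances from empty prefix of a
    for i, x in enumerate(a, start=1):
        curr = [i]
        for j, y in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,             # deletion of x
                curr[j - 1] + 1,         # insertion of y
                prev[j - 1] + (x != y),  # substitution (free if tokens match)
            ))
        prev = curr
    return prev[-1]

# Hypothetical comparison of a model realization against a corpus sentence.
realized = "the dog chased a cat".split()
target = "the dog chased the cat".split()
print(edit_distance(realized, target))  # → 1
```

Lower scores indicate realizations closer to the corpus sentence; the paper correlates such similarity scores with incrementality and working memory usage.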
Original language | English (US)
---|---
Pages | 211-216
Number of pages | 6
State | Published - 2017
Event | 15th International Conference on Cognitive Modeling, ICCM 2017 - Coventry, United Kingdom. Duration: Jul 22 2017 → Jul 25 2017
Conference
Conference | 15th International Conference on Cognitive Modeling, ICCM 2017
---|---
Country/Territory | United Kingdom
City | Coventry
Period | 7/22/17 → 7/25/17
All Science Journal Classification (ASJC) codes
- Modeling and Simulation
- Artificial Intelligence