PuSH - Publication Server of Helmholtz Zentrum München

Meghdadi, M. ; Duff, J.* ; Demberg, V.*

Integrating language model embeddings into the ACT-R cognitive modeling framework.

Front. Lang. Sci. 5:1721326 (2026)
Publisher's version DOI
Open Access Gold
Creative Commons license
In 2025, psycholinguistic research has the benefit of large, high-quality datasets of human behavior, and massively scalable metrics for variables of interest like frequency and association. This means we have more data than ever before to shed light on classic language processing phenomena like associative priming. But in order to build and test rigorous theories against this data, we also need computational modeling tools that can simulate cognitive mechanisms and generate quantitative predictions at the same scale. In this paper, we assemble one such case, adapting the ACT-R cognitive modeling framework to make use of association metrics derived from language model embeddings, in service of a scalable model of associative priming in the Lexical Decision Task. ACT-R implements a model of memory retrieval that can use item-wise predictors like frequency and association to predict task response times (RTs), via interpretable and meaningfully parameterized components like spreading activation. But currently, ACT-R's spreading activation calculations rely on manually coded similarity scores, which are labor-intensive and prone to inaccuracies, particularly for large vocabularies. In this study, we replace these hand-coded associations with cosine similarity scores derived from Word2Vec and BERT embeddings, thereby improving both scalability and predictive accuracy while retaining ACT-R's interpretability. We compare various versions of our model against observed human RTs from the Semantic Priming Project dataset, observing impressive item-wise prediction accuracy, and achieving the strongest alignment with a model where spreading activation is penalized via a scalable approximation of the classic "fan effect." These findings provide a proof of concept for integrating embedding-based representations into algorithmic-level models of language processing.
More than an insight into models of priming, we see this as a first step toward scalable and specific models of more complex phenomena.
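To make the approach described in the abstract concrete, here is a minimal, hypothetical sketch of how embedding-derived cosine similarities can stand in for hand-coded association strengths in ACT-R's spreading-activation equation, A_i = B_i + Σ_j W_j · S_ji. The toy 4-dimensional vectors, the function names, and the scaling are illustrative assumptions, not the paper's actual implementation; real models would use Word2Vec or BERT embeddings.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def activation(base_level, cue_vecs, chunk_vec, W=1.0):
    # Simplified ACT-R activation: A_i = B_i + sum_j W_j * S_ji,
    # with the association strength S_ji approximated by cosine
    # similarity instead of a hand-coded score.
    spread = sum((W / len(cue_vecs)) * cosine(c, chunk_vec)
                 for c in cue_vecs)
    return base_level + spread

# Toy 4-d "embeddings" standing in for Word2Vec/BERT vectors.
doctor = np.array([1.0, 0.9, 0.1, 0.0])
nurse  = np.array([0.9, 1.0, 0.2, 0.1])  # semantically related to "doctor"
bread  = np.array([0.0, 0.1, 1.0, 0.9])  # unrelated

# A related prime cue boosts the target's activation more than an
# unrelated one, yielding faster predicted retrieval (priming).
a_related   = activation(base_level=0.5, cue_vecs=[doctor], chunk_vec=nurse)
a_unrelated = activation(base_level=0.5, cue_vecs=[doctor], chunk_vec=bread)
print(a_related > a_unrelated)  # → True
```

In full ACT-R, the higher activation of the related target would be mapped to a shorter retrieval latency, which is how the model generates RT predictions for the Lexical Decision Task.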
Publication type Article: Journal article
Document type Scientific article
Keywords ACT-R ; Associative Priming ; Cognitive Modeling ; Distributional Semantics ; Language Models ; Psycholinguistics
ISSN (print) / ISBN 2813-4605
e-ISSN 2813-4605
Citation Volume: 5, Issue: , Pages: , Article number: 1721326 Supplement: ,
Publisher Frontiers Media S.A.
Review status Peer reviewed