TY - JOUR
AB - Large language models (LLMs) have the potential to revolutionize behavioral science by accelerating and improving the research cycle, from conceptualization to data analysis. Unlike closed-source solutions, open-source frameworks for LLMs can enable transparency, reproducibility, and adherence to data protection standards, which gives them a crucial advantage for use in behavioral science. To help researchers harness the promise of LLMs, this tutorial offers a primer on the open-source Hugging Face ecosystem and demonstrates several applications that advance conceptual and empirical work in behavioral science, including feature extraction, fine-tuning of models for prediction, and generation of behavioral responses. Executable code is made available at github.com/Zak-Hussain/LLM4BeSci.git. Finally, the tutorial discusses challenges faced by research with (open-source) LLMs related to interpretability and safety and offers a perspective on future research at the intersection of language modeling and behavioral science.
AU - Hussain, Z.*
AU - Binz, M.
AU - Mata, R.*
AU - Wulff, D.U.*
C1 - 71487
C2 - 56210
CY - One New York Plaza, Suite 4600, New York, NY, United States
TI - A tutorial on open-source large language models for behavioral science.
JO - Behav. Res. Methods
PB - Springer
PY - 2024
SN - 1554-351X
ER -
TY - JOUR
AB - This paper presents a model that allows group comparisons of gaze behavior while watching dynamic video stimuli. The model is based on the approach of Coutrot and Guyader (2017) and forms a master saliency map from linear combinations of feature maps. The feature maps in the model are, for example, the dynamically salient contents of a video stimulus or predetermined areas of interest. The model takes into account temporal aspects of the stimuli, a crucial difference from other common models. The multi-group extension of the model introduced here makes it possible to obtain relative importance plots, which visualize the effect of a specific feature of a stimulus on the attention and visual behavior of two or more experimental groups. These plots are interpretable summaries of data with high spatial and temporal resolution. This approach differs from many common methods for comparing gaze behavior between natural groups, which usually include only one-dimensional features such as the duration of fixation on a particular part of the stimulus. The method is illustrated by contrasting a sample of persons with particularly high cognitive abilities (high achievement on IQ tests) with a control group on a psycholinguistic task on the conceptualization of motion events. In the example, we find no substantive differences in relative importance, but more exploratory gaze behavior in the highly gifted group. The code, videos, and eye-tracking data we used for this study are available online.
AU - Stadler, M.
AU - Doebler, P.*
AU - Mertins, B.*
AU - Delucchi Danhier, R.*
C1 - 62078
C2 - 50640
SP - 2650-2667
TI - Statistical modeling of dynamic eye-tracking experiments: Relative importance of visual stimulus elements for gaze behavior in the multi-group case.
JO - Behav. Res. Methods
VL - 53
IS - 6
PY - 2021
SN - 1554-351X
ER -