PuSH - Publikationsserver des Helmholtz Zentrums München

Rupp, L.H.* ; Kumar, A.* ; Sadeghi, M.* ; Schindler-Gmelch, L.* ; Keinert, M.* ; Eskofier, B.M. ; Berking, M.*

Stress can be detected during emotion-evoking smartphone use: A pilot study using machine learning.

Front. Digit. Health 7:1578917 (2025)
Publisher version · DOI · PMC
Open Access Gold
Creative Commons license
INTRODUCTION: The detrimental consequences of stress highlight the need for precise stress detection, as this offers a window for timely intervention. However, both objective and subjective measurements suffer from validity limitations. Contactless sensing technologies using machine learning methods present a potential alternative and could be used to estimate stress from externally visible physiological changes, such as emotional facial expressions. Although previous studies were able to classify stress from emotional expressions with accuracies of up to 88.32%, most works employed a classification approach and relied on data from contexts where stress was induced. Therefore, the primary aim of the present study was to clarify whether stress can be detected from facial expressions of six basic emotions (anxiety, anger, disgust, sadness, joy, love) and relaxation using a prediction approach.

METHOD: To attain this goal, we analyzed video recordings of facial emotional expressions collected from n = 69 participants in a secondary analysis of a dataset from an interventional study. We aimed to explore associations with stress (assessed by the PSS-10 and a one-item stress measure).

RESULTS: Comparing two regression machine learning models [Random Forest (RF) and XGBoost], we found that facial emotional expressions were promising indicators of stress scores, with model fit being best when data from all six emotional facial expressions were used to train the model (one-item stress measure: MSE (XGB) = 2.31, MAE (XGB) = 1.32, MSE (RF) = 3.86, MAE (RF) = 1.69; PSS-10: MSE (XGB) = 25.65, MAE (XGB) = 4.16, MSE (RF) = 26.32, MAE (RF) = 4.14). XGBoost proved more reliable for prediction, with lower errors on both training and test data.

DISCUSSION: The findings provide further evidence that non-invasive video recordings can complement standard objective and subjective markers of stress.
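The regression comparison described in the RESULTS can be sketched roughly as follows. This is a minimal illustration, not the study's pipeline: the features, labels, and data are synthetic placeholders, and scikit-learn's GradientBoostingRegressor stands in for the XGBoost library.

```python
# Sketch of comparing two regressors that predict a stress score from
# facial-expression features, scored with MSE and MAE as in the paper.
# All data below is synthetic; feature count and score range are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
n_participants, n_features = 69, 24              # 69 participants as in the study; 24 hypothetical features
X = rng.normal(size=(n_participants, n_features))
y = rng.uniform(0, 10, size=n_participants)      # hypothetical one-item stress score

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

for name, model in [("RF", RandomForestRegressor(random_state=0)),
                    ("XGB-like", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: MSE={mean_squared_error(y_test, pred):.2f}, "
          f"MAE={mean_absolute_error(y_test, pred):.2f}")
```

On real data, the reported pattern (lower MSE/MAE for XGBoost than for Random Forest) would be read off exactly this kind of test-set comparison.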
Publication type: Article (journal article)
Document type: Scientific article
Corresponding author
Keywords: Automated Stress Recognition; Emotion; Emotion Expression; Machine Learning; Stress; Facial Expressions; Gender Differences; Responses
ISSN (print) / ISBN 2673-253X
e-ISSN 2673-253X
Citation: Volume: 7, Article number: 1578917
Publisher: Frontiers
Place of publication: Avenue Du Tribunal Federal 34, Lausanne, CH-1015, Switzerland
Non-patent literature: Publications
Review status: Peer reviewed
Institute(s): Institute of AI for Health (AIH)
Funding: German Research Foundation (Deutsche Forschungsgemeinschaft, DFG); Bavarian Ministry of Science and Arts