Rupp, L.H.*; Kumar, A.*; Sadeghi, M.*; Schindler-Gmelch, L.*; Keinert, M.*; Eskofier, B.M.; Berking, M.*
Stress can be detected during emotion-evoking smartphone use: A pilot study using machine learning.
Front. Digit. Health 7:1578917 (2025)
INTRODUCTION: The detrimental consequences of stress highlight the need for precise stress detection, as this offers a window for timely intervention. However, both objective and subjective measurements suffer from validity limitations. Contactless sensing technologies using machine learning methods present a potential alternative and could be used to estimate stress from externally visible physiological changes, such as emotional facial expressions. Although previous studies classified stress from emotional expressions with accuracies of up to 88.32%, most relied on a classification approach and on data from contexts where stress was induced. The primary aim of the present study was therefore to clarify whether stress can be detected from facial expressions of six basic emotions (anxiety, anger, disgust, sadness, joy, love) and relaxation using a prediction approach. METHOD: To this end, we analyzed video recordings of facial emotional expressions collected from n = 69 participants in a secondary analysis of a dataset from an interventional study, exploring associations with stress (assessed by the PSS-10 and a one-item stress measure). RESULTS: Comparing two regression machine learning models [Random Forest (RF) and XGBoost (XGB)], we found that facial emotional expressions were promising indicators of stress scores, with model fit being best when data from all six emotional facial expressions were used to train the model (one-item stress measure: MSE(XGB) = 2.31, MAE(XGB) = 1.32, MSE(RF) = 3.86, MAE(RF) = 1.69; PSS-10: MSE(XGB) = 25.65, MAE(XGB) = 4.16, MSE(RF) = 26.32, MAE(RF) = 4.14). XGBoost proved more reliable for prediction, with lower errors on both training and test data. DISCUSSION: The findings provide further evidence that non-invasive video recordings can complement standard objective and subjective markers of stress.
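The regression comparison described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' pipeline: the feature matrix and stress scores are synthetic stand-ins for the facial-expression data, and scikit-learn's GradientBoostingRegressor is used as a stand-in for XGBoost to keep the sketch dependency-light.

```python
# Illustrative sketch of comparing two regression models on MSE/MAE,
# as in the abstract. All data here is synthetic (assumption), and
# GradientBoostingRegressor stands in for XGBoost.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(69, 12))  # 69 participants, 12 hypothetical expression features
y = X @ rng.normal(size=12) + rng.normal(scale=0.5, size=69)  # synthetic stress score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("Random Forest", RandomForestRegressor(random_state=0)),
                    ("Gradient Boosting", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: MSE={mean_squared_error(y_te, pred):.2f}, "
          f"MAE={mean_absolute_error(y_te, pred):.2f}")
```

Comparing both models on held-out test data, as sketched here, is what distinguishes the prediction approach from reporting in-sample classification accuracy alone.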
Publication type
Article: Journal article
Document type
Scientific Article
Keywords
Automated Stress Recognition; Emotion; Emotion Expression; Machine Learning; Stress; Facial Expressions; Gender-differences; Responses
Language
English
Publication Year
2025
HGF-reported in Year
2025
ISSN (print) / ISBN
2673-253X
e-ISSN
2673-253X
Source details
Volume: 7, Article Number: 1578917
Publisher
Frontiers
Publishing Place
Avenue du Tribunal Fédéral 34, CH-1015 Lausanne, Switzerland
Reviewing status
Peer reviewed
POF-Topic(s)
30205 - Bioengineering and Digital Health
Research field(s)
Enabling and Novel Technologies
PSP Element(s)
G-540008-001
Grants
German Research Foundation (Deutsche Forschungsgemeinschaft, DFG)
Bavarian Ministry of Science and Arts
Date of record entry
2025-05-16