Rupp, L.H.*; Kumar, A.*; Sadeghi, M.*; Schindler-Gmelch, L.*; Keinert, M.*; Eskofier, B.M.; Berking, M.*
Stress can be detected during emotion-evoking smartphone use: A pilot study using machine learning.
Front. Digit. Health 7:1578917 (2025)
INTRODUCTION: The detrimental consequences of stress highlight the need for precise stress detection, as this offers a window for timely intervention. However, both objective and subjective measurements suffer from validity limitations. Contactless sensing technologies using machine learning methods present a potential alternative and could be used to estimate stress from externally visible physiological changes, such as emotional facial expressions. Although previous studies were able to classify stress from emotional expressions with accuracies of up to 88.32%, most works employed a classification approach and relied on data from contexts in which stress was explicitly induced. Therefore, the primary aim of the present study was to clarify whether stress can be detected from facial expressions of six basic emotions (anxiety, anger, disgust, sadness, joy, love) and relaxation using a prediction approach.

METHOD: To attain this goal, we analyzed video recordings of facial emotional expressions collected from n = 69 participants in a secondary analysis of a dataset from an interventional study, exploring associations with stress (assessed by the PSS-10 and a one-item stress measure).

RESULTS: Comparing two regression machine learning models, Random Forest (RF) and XGBoost (XGB), we found that facial emotional expressions were promising indicators of stress scores, with model fit being best when data from all six emotional facial expressions were used to train the model (one-item stress measure: MSE(XGB) = 2.31, MAE(XGB) = 1.32, MSE(RF) = 3.86, MAE(RF) = 1.69; PSS-10: MSE(XGB) = 25.65, MAE(XGB) = 4.16, MSE(RF) = 26.32, MAE(RF) = 4.14). XGBoost proved more reliable for prediction, with lower error on both training and test data.

DISCUSSION: The findings provide further evidence that non-invasive video recordings can complement standard objective and subjective markers of stress.
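The model comparison described in the abstract (RF vs. gradient boosting regressors, evaluated with MSE and MAE) can be illustrated with a minimal sketch. This is not the authors' code: the features and stress scores below are synthetic placeholders, and scikit-learn's GradientBoostingRegressor stands in for the XGBoost library.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(42)
# Placeholder for facial-expression features (e.g., action-unit intensities);
# n = 69 mirrors the sample size reported in the abstract.
X = rng.normal(size=(69, 12))
# Placeholder stress scores (e.g., a 0-10 one-item measure).
y = np.clip(X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=69), 0, 10)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

for name, model in [("RF", RandomForestRegressor(random_state=0)),
                    ("GB", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    # Report the two error metrics used in the paper's results section.
    print(f"{name}: MSE = {mean_squared_error(y_test, pred):.2f}, "
          f"MAE = {mean_absolute_error(y_test, pred):.2f}")
```

On real data, the comparison would additionally inspect training-set error to judge overfitting, which is how the abstract distinguishes the reliability of the two models.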
Publication type
Article: Journal article
Document type
Scientific article
Keywords
Automated Stress Recognition; Emotion; Emotion Expression; Machine Learning; Stress; Facial Expressions; Gender Differences; Responses
Language
English
Year of publication
2025
HGF reporting year
2025
ISSN (print) / ISBN
2673-253X
e-ISSN
2673-253X
Citation
Volume: 7, Article number: 1578917
Publisher
Frontiers
Place of publication
Avenue Du Tribunal Federal 34, Lausanne, CH-1015, Switzerland
Review status
Peer reviewed
POF Topic(s)
30205 - Bioengineering and Digital Health
Research field(s)
Enabling and Novel Technologies
PSP element(s)
G-540008-001
Funding
German Research Foundation (Deutsche Forschungsgemeinschaft, DFG)
Bavarian Ministry of Science and Arts
Date of record
2025-05-16