Evaluation of randomized input sampling for explanation (RISE) for 3D XAI - proof of concept for black-box brain-hemorrhage classification.
In: Proceedings of the 2023 International Conference on Medical Imaging and Computer-Aided Diagnosis (MICAD 2023). 2024, pp. 41-51 (Lecture Notes in Electrical Engineering; vol. 1166 LNEE)
An increasing number of AI products for medical imaging are offered to healthcare organizations, but they are frequently considered a ‘black box’, offering only limited insight into the model’s functionality. Model-agnostic methods are therefore required to provide Explainable AI (XAI), improve clinicians’ trust, and thus accelerate adoption. However, published methods that explain 3D classification models and are systematically evaluated for medical imaging applications are currently lacking. Here, the popular explainability method RISE is modified so that, for the first time to the best of our knowledge, it can be applied to 3D medical image classification. The method was assessed using recently proposed guidelines for clinical explainable AI. When different parameters were tested on a 3D CT dataset with a classifier detecting the presence of brain hemorrhage, we found that combining different algorithms to produce the 3D occlusion patterns led to better and more reliable explainability results. This was confirmed both by quantitative metrics and by a clinical expert’s interpretability assessment of the 3D saliency heatmaps.
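The core idea of RISE carries over directly to volumes: the classifier is queried on many randomly occluded copies of the 3D input, and the occlusion masks are accumulated weighted by the resulting class scores. The sketch below is a minimal illustration of that scheme in Python, not the authors' implementation; the classifier hook model_score, the mask grid size, and the keep probability p_keep are assumptions chosen for clarity.

    import numpy as np
    from scipy.ndimage import zoom

    def rise_3d_saliency(volume, model_score, n_masks=500, grid=(4, 4, 4), p_keep=0.5, seed=0):
        # RISE-style saliency for a single 3D volume of shape (D, H, W).
        # model_score(volume) is assumed to return the target-class probability.
        rng = np.random.default_rng(seed)
        D, H, W = volume.shape
        saliency = np.zeros((D, H, W), dtype=np.float64)
        for _ in range(n_masks):
            # Low-resolution random binary grid, upsampled to a smooth 3D occlusion mask.
            coarse = (rng.random(grid) < p_keep).astype(np.float64)
            mask = np.clip(zoom(coarse, (D / grid[0], H / grid[1], W / grid[2]), order=1), 0.0, 1.0)
            score = model_score(volume * mask)   # classifier score on the occluded volume
            saliency += score * mask             # accumulate masks weighted by their scores
        return saliency / (n_masks * p_keep)     # RISE normalization by expected mask coverage

In the paper, the generation of the 3D occlusion patterns is itself varied, and combining different mask-generation algorithms gave the most reliable heatmaps; the upsampled random grid above is only one such choice.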
Publication type
Article: Conference paper
Keywords
Explainable AI; Hemorrhage Classification; RISE
ISSN (print) / ISBN
1876-1100
e-ISSN
1876-1119
Conference title
Proceedings of 2023 International Conference on Medical Imaging and Computer-Aided Diagnosis (MICAD 2023)
Source details
Volume: 1166 LNEE
Pages: 41-51
Non-patent literature
Publications
Institute(s)
Institute for Machine Learning in Biomed Imaging (IML)