PuSH - Publication Server of Helmholtz Zentrum München

Reisenbüchler, D. ; Wagner, S. ; Boxberg, M.* ; Peng, T.

Local attention graph-based transformer for multi-target genetic alteration prediction.

Lect. Notes Comput. Sc. 13432 LNCS, 377-386 (2022)
Open Access Green
Classical multiple instance learning (MIL) methods often rest on the assumption that instances are independent and identically distributed (i.i.d.), hence neglecting the potentially rich contextual information beyond individual entities. Transformers with global self-attention modules, on the other hand, have been proposed to model the interdependencies among all instances. In this paper, however, we ask: is global relation modeling using self-attention necessary, or can we appropriately restrict self-attention calculations to local regimes in large-scale whole slide images (WSIs)? We propose a general-purpose local attention graph-based Transformer for MIL (LA-MIL), which introduces an inductive bias by explicitly contextualizing instances in adaptive local regimes of arbitrary size. Additionally, an efficiently adapted loss function enables our approach to learn expressive WSI embeddings for the joint analysis of multiple biomarkers. We demonstrate that LA-MIL achieves state-of-the-art results in mutation prediction for gastrointestinal cancer, outperforming existing models on important biomarkers such as microsatellite instability for colorectal cancer. Our findings suggest that local self-attention suffices to model dependencies on par with global modules. Our LA-MIL implementation is available at https://github.com/agentdr1/LA_MIL.
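As a rough illustration of the core idea, the sketch below restricts multi-head self-attention over tile features to a k-nearest-neighbour graph built from tile coordinates, mean-pools the contextualized features into a slide embedding, and predicts several biomarkers jointly with a multi-label loss. All names (knn_adjacency, LocalAttention, LAMILHead), the neighbourhood size k, the mean-pooling aggregation, and the binary cross-entropy loss are illustrative assumptions, not the authors' exact architecture; see the linked repository for the actual implementation.

    # Minimal sketch of graph-restricted local self-attention for MIL,
    # in the spirit of LA-MIL; hyperparameters and helpers are illustrative.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def knn_adjacency(coords: torch.Tensor, k: int = 8) -> torch.Tensor:
        """Boolean [N, N] adjacency from k-nearest neighbours in tile coordinates."""
        dist = torch.cdist(coords, coords)              # [N, N] pairwise distances
        knn = dist.topk(k + 1, largest=False).indices   # k neighbours plus self
        adj = torch.zeros_like(dist, dtype=torch.bool)
        adj.scatter_(1, knn, True)
        return adj

    class LocalAttention(nn.Module):
        """Self-attention over instance features, masked to a local kNN graph."""
        def __init__(self, dim: int, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

        def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
            # In attn_mask, True entries are *blocked*, so invert the adjacency.
            out, _ = self.attn(x, x, x, attn_mask=~adj)
            return x + out  # residual connection

    class LAMILHead(nn.Module):
        """Local attention + mean pooling + multi-target (multi-label) head."""
        def __init__(self, dim: int, num_targets: int):
            super().__init__()
            self.local_attn = LocalAttention(dim)
            self.classifier = nn.Linear(dim, num_targets)

        def forward(self, feats: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
            adj = knn_adjacency(coords)                     # local regime per tile
            h = self.local_attn(feats.unsqueeze(0), adj).squeeze(0)
            slide_embedding = h.mean(dim=0)                 # aggregate to one WSI vector
            return self.classifier(slide_embedding)         # one logit per biomarker

    # Joint prediction of multiple genetic alterations: one sigmoid/BCE term per target.
    # feats: [N, dim] tile features; coords: [N, 2] tile positions; y: [num_targets] labels.
    feats, coords = torch.randn(500, 256), torch.rand(500, 2)
    y = torch.randint(0, 2, (4,)).float()
    model = LAMILHead(dim=256, num_targets=4)
    loss = F.binary_cross_entropy_with_logits(model(feats, coords), y)

Restricting the attention mask to the kNN graph is what replaces the Transformer's global all-pairs interaction with the paper's "local regimes", while the shared multi-label head reflects the joint analysis of multiple biomarkers described in the abstract.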
Publication type Article: Journal article
Document type Scientific Article
Keywords Graph Transformer ; Local Attention ; Multiple Instance Learning ; Mutation Prediction ; Whole Slide Images
ISSN (print) / ISBN 0302-9743
e-ISSN 1611-3349
Conference Title Medical Image Computing and Computer Assisted Intervention – MICCAI 2022
Citation Volume: 13432 LNCS, Pages: 377-386
Publisher Springer
Publishing Place Berlin [et al.]
Non-patent literature Publications