PuSH - Publikationsserver des Helmholtz Zentrums München
Heinrich, M.P.* ; Jenkinson, M.* ; Bhushan, M.* ; Matin, T.* ; Gleeson, F.V.* ; Brady, S.M.* ; Schnabel, J.A.*
MIND: Modality independent neighbourhood descriptor for multi-modal deformable registration.
Med. Image Anal. 16:7, 1423-1435 (2012)
Open Access Green: possible as soon as the postprint has been submitted to the ZB (Central Library).
Abstract
Deformable registration of images obtained from different modalities remains a challenging task in medical image analysis. This paper addresses this important problem and proposes a modality independent neighbourhood descriptor (MIND) for both linear and deformable multi-modal registration. Based on the similarity of small image patches within one image, it aims to extract the distinctive structure in a local neighbourhood, which is preserved across modalities. The descriptor is based on the concept of image self-similarity, which has been introduced for non-local means filtering for image denoising. It is able to distinguish between different types of features such as corners, edges and homogeneously textured regions. MIND is robust to the most considerable differences between modalities: non-functional intensity relations, image noise and non-uniform bias fields. The multi-dimensional descriptor can be efficiently computed in a dense fashion across the whole image and provides point-wise local similarity across modalities based on the absolute or squared difference between descriptors, making it applicable for a wide range of transformation models and optimisation algorithms. We use the sum of squared differences of the MIND representations of the images as a similarity metric within a symmetric non-parametric Gauss-Newton registration framework. In principle, MIND would be applicable to the registration of arbitrary modalities. In this work, we apply and validate it for the registration of clinical 3D thoracic CT scans between inhale and exhale as well as the alignment of 3D CT and MRI scans. Experimental results show the advantages of MIND over state-of-the-art techniques such as conditional mutual information and entropy images, with respect to clinically annotated landmark locations. © 2012 Elsevier B.V.
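The descriptor construction outlined in the abstract can be sketched in a few lines of NumPy. This is a minimal 2D illustration, not the authors' implementation: the four axis-neighbour offsets stand in for the paper's 3D six-neighbourhood, and the function names, patch radius, and wrap-around boundary handling (`np.roll`) are our simplifying assumptions. It computes, per pixel, Gaussian-weighted patch distances to the neighbouring patches, normalised so the maximum response is 1, and then uses the sum of squared differences between descriptors as the point-wise cross-modal similarity.

```python
import numpy as np

def mind_descriptor(img, radius=1, eps=1e-8):
    """Sketch of a MIND-like descriptor for a 2D image (assumed layout).

    For each pixel, compare the patch around it with the patches around its
    four axis neighbours, weight the patch distances by a locally estimated
    variance, and normalise the maximum response to 1.
    """
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 2D analogue of the six-neighbourhood
    dists = []
    for off in offsets:
        shifted = np.roll(img, shift=off, axis=(0, 1))
        diff2 = (img - shifted) ** 2
        # Patch distance D_p(x, x+r): squared differences summed over a
        # (2*radius+1)^2 box around every pixel.
        dp = np.zeros_like(diff2)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                dp += np.roll(diff2, shift=(dy, dx), axis=(0, 1))
        dists.append(dp)
    dists = np.stack(dists)              # shape (4, H, W)
    variance = dists.mean(axis=0) + eps  # local variance estimate V(x)
    mind = np.exp(-dists / variance)     # Gaussian-weighted self-similarity
    return mind / mind.max(axis=0, keepdims=True)  # max response normalised to 1

def mind_ssd(img_a, img_b):
    """Point-wise cross-modal similarity: SSD between the two descriptor stacks."""
    return np.sum((mind_descriptor(img_a) - mind_descriptor(img_b)) ** 2, axis=0)
```

This descriptor-space SSD is the quantity the paper plugs into its symmetric non-parametric Gauss-Newton registration framework; identical inputs give zero dissimilarity everywhere.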
Publication type
Article: Journal article
Document type
Scientific article
Keywords
Multi-modal Similarity Metric ; Non-local Means ; Non-rigid Registration ; Pulmonary Images ; Self-similarity
ISSN (print): 1361-8415
e-ISSN: 1361-8415
Journal
Medical Image Analysis
Citation
Volume: 16, Issue: 7, Pages: 1423-1435
Publisher
Elsevier
Review status
Peer reviewed
Institute(s)
Institute for Machine Learning in Biomed Imaging (IML)