Multimodal Brain Image Segmentation and Analysis with Neuromorphic Attention-Based Learning

Authors: Woo-Sup Han, Il Song Han
Year of publication: 2020
Source: Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries. ISBN: 9783030466428
BrainLes@MICCAI (2)
DOI: 10.1007/978-3-030-46643-5_2
Abstract: Automated image analysis of brain tumors from 3D Magnetic Resonance Imaging (MRI) is necessary for the diagnosis and treatment planning of the disease, because manual tumor segmentation is time-consuming, expensive, and subject to clinician diagnostic error. We propose a novel neuromorphic attention-based learner (NABL) model to train a deep neural network for tumor segmentation, a task that faces the challenges of typically small datasets and the difficulty of determining segmentation classes exactly. The core idea is to introduce neuromorphic attention to guide the learning process of the deep neural network architecture, highlighting the region of interest for tumor segmentation. Neuromorphic convolution filters mimicking visual cortex neurons are adopted for generating the attention, transferred from neuromorphic convolutional neural networks (CNNs) pre-trained for adversarial imagery environments. Our pre-trained neuromorphic CNN has feature extraction ability applicable to brain MRI data, verified by overall survival prediction without tumor segmentation training at the Brain Tumor Segmentation (BraTS) Challenge 2018. NABL provides an affordable solution for more accurate and faster image analysis of brain tumor segmentation by incorporating the typical encoder-decoder U-net CNN architecture. Experimental results illustrate the effectiveness and feasibility of our proposed method under flexible requirements for clinical diagnostic decision data, from segmentation to overall survival prediction. The overall survival prediction accuracy for predicting the survival period in days is 55% on the BraTS 2019 validation dataset and 48.6% on the BraTS 2019 test dataset.
Database: OpenAIRE
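The abstract describes attention maps produced by fixed convolution filters that mimic orientation-selective visual-cortex neurons, which then highlight regions of interest for a U-net segmenter. The paper's actual filters are transferred from a pre-trained neuromorphic CNN; as a minimal illustrative sketch only, the snippet below stands in Gabor-like oriented kernels (a common model of simple-cell receptive fields) and shows how such responses could be collapsed into a normalized attention map over a 2D MRI slice. All function names and parameters here are hypothetical, not from the paper.

```python
import numpy as np

def gabor_kernel(size=7, theta=0.0, sigma=2.0, lam=4.0):
    """Oriented Gabor-like filter: a stand-in model of a visual-cortex
    simple cell (hypothetical substitute for the paper's neuromorphic
    convolution filters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def conv2d(img, kernel):
    """Naive 'same' 2D convolution with zero padding (clarity over speed)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def neuromorphic_attention(mri_slice, n_orientations=4):
    """Attention map: maximum absolute response over oriented filters,
    normalized to [0, 1]. High values mark structured (edge-rich) regions
    that would gate the U-net encoder features."""
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    responses = [np.abs(conv2d(mri_slice, gabor_kernel(theta=t))) for t in thetas]
    att = np.max(responses, axis=0)
    return att / (att.max() + 1e-8)
```

In an attention-guided encoder-decoder, such a map would typically multiply (or additively re-weight) the encoder feature maps, e.g. `gated = features * (1 + attention)`, so that training concentrates on the highlighted region; this gating form is an assumption for illustration, not taken from the paper.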