[Keywords]
[Abstract]
Brain tumor magnetic resonance imaging (MRI) segmentation is an important step in the diagnosis and treatment of brain tumors. The U-Net architecture limits the receptive field over image features and leaves a gap in contextual information, which lowers segmentation accuracy. To address this, a brain tumor MRI segmentation algorithm that fuses multi-scale features is proposed. First, a multi-scale aggregation module (MAM) is designed to replace the conventional convolutional layers of the original U-Net, increasing the depth and width of the network to capture boundary detail in the feature maps. Second, a context atrous spatial pyramid (CASP) module replaces the direct concatenation in the skip connections, enlarging the network's receptive field and strengthening the extraction of lesions at different scales. Finally, a multi-level aggregation attention (MAA) module is designed at the bottom of the U-shaped network, directing the model toward the effective features of the segmentation region and suppressing background noise. The improved algorithm is validated experimentally on the brain tumor data of the Cancer Genome Atlas (TCGA) database. The results show that the proposed algorithm achieves a mean intersection over union (mIoU) of 91.39%, a Dice coefficient of 92.81%, a sensitivity of 89.14%, a specificity of 99.95%, and an accuracy of 95.78%.
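The abstract states what the CASP module does at each skip connection (enlarge the receptive field and extract lesions at multiple scales) but not how it is built. A minimal PyTorch sketch follows, assuming CASP resembles the familiar atrous-spatial-pyramid pattern of parallel dilated 3x3 convolutions fused by a 1x1 convolution; the module name CASPSketch, the dilation rates, and the channel sizes are illustrative assumptions, not taken from the paper.

import torch
import torch.nn as nn

class CASPSketch(nn.Module):
    """ASPP-style skip-connection block (illustrative; not the paper's exact CASP)."""
    def __init__(self, channels, dilations=(1, 2, 4, 8)):
        super().__init__()
        # One dilated 3x3 branch per rate; padding = dilation keeps H x W unchanged,
        # while larger dilations sample context farther from each pixel.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # A 1x1 convolution fuses the concatenated multi-scale context back
        # to the original channel count.
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)

    def forward(self, x):
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))

# Hypothetical usage: enrich a 64-channel encoder feature map before the
# decoder concatenation that U-Net's plain skip connection would perform.
skip = torch.randn(1, 64, 128, 128)
enriched = CASPSketch(64)(skip)  # same spatial size, wider receptive field

In the modified U-Net described above, such a block would filter each encoder feature map before it is concatenated with the matching decoder feature, in place of the direct copy used by the original skip connection.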
[CLC number]
TP391.41
[Funding]
Supported by the Open Fund of the State Key Laboratory (2021SKLKF11)