Multi-level deep feature fusion for breast cancer histopathology image classification
DOI:
Author:
Affiliation:

1. Dalian Minzu University; 2. Dalian University of Technology

Author biography:

Corresponding author:

CLC number: TP391

Fund projects: National Natural Science Foundation of China; Young and Middle-aged Talent Cultivation Program of the National Ethnic Affairs Commission; Applied Basic Research Program of Liaoning Province



    Abstract:

To address the limitations of existing deep learning methods in jointly representing the global and local features of breast cancer histopathology images, a multi-level deep feature fusion method for breast cancer histopathology image classification is proposed, combining a Transformer, with its long-range modeling capability, and a convolutional neural network (CNN), with its strong local perception. The method adopts dual parallel branches, DeiT-B and ResNet-18, as the backbone architecture and introduces feature fusion operations at both the intermediate layers and the ends of the two branches, effectively strengthening the joint learning of global and local deep features of breast cancer histopathology images. In addition, dense connections are introduced between the residual blocks of the CNN branch to improve the propagation of the fused intermediate features. Through global-local feature extraction together with intra-branch and inter-branch feature interaction, discriminative features for breast cancer histopathology image classification can be captured more effectively. Ablation and comparative experiments on the public breast cancer histopathology image dataset BreakHis demonstrate the effectiveness of the proposed method, which achieves a best classification accuracy of 99.83%.
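To make the architecture described above concrete, the following is a minimal, runnable PyTorch sketch under stated assumptions: the class name DualBranchFusionNet, the reduced Transformer width and depth (the real DeiT-B uses embed dim 768 and 12 blocks; a small nn.TransformerEncoder stands in here), and the specific fusion operators (pooled-summary addition at the mid point, concatenation at the end) are illustrative choices, not the paper's exact design. The CNN stages come from torchvision's resnet18, with 1x1 projections implementing the dense inter-stage connections.

```python
# Illustrative sketch only; all fusion operators and dimensions are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18


class DualBranchFusionNet(nn.Module):
    def __init__(self, num_classes=2, embed_dim=384, depth=8, nhead=6):
        super().__init__()
        # ---- CNN branch: ResNet-18 stages with dense inter-stage links ----
        r = resnet18(weights=None)
        self.stem = nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool)
        self.stages = nn.ModuleList([r.layer1, r.layer2, r.layer3, r.layer4])
        # 1x1 convs squeeze the concatenation of all earlier stage outputs
        # back to the channel width each stage expects (dense connections).
        concat_ch, in_ch = [64, 128, 256, 512], [64, 64, 128, 256]
        self.dense_proj = nn.ModuleList(
            nn.Conv2d(c, o, kernel_size=1) for c, o in zip(concat_ch, in_ch))

        # ---- Transformer branch: ViT-style patch tokens + [CLS] token ----
        self.patch_embed = nn.Conv2d(3, embed_dim, kernel_size=16, stride=16)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, 14 * 14 + 1, embed_dim))
        layer = nn.TransformerEncoderLayer(embed_dim, nhead, 4 * embed_dim,
                                           batch_first=True)
        half = depth // 2
        self.blocks_lo = nn.TransformerEncoder(layer, num_layers=half)
        self.blocks_hi = nn.TransformerEncoder(layer, num_layers=depth - half)

        # ---- Mid-level fusion: exchange summaries between branches ----
        self.cnn_to_tok = nn.Linear(128, embed_dim)  # CNN mid -> token space
        self.tok_to_cnn = nn.Linear(embed_dim, 128)  # [CLS] -> CNN mid space

        # ---- End fusion: concatenate both global descriptors ----
        self.head = nn.Linear(512 + embed_dim, num_classes)

    def _cnn_stage(self, i, feats):
        # Densely aggregate every earlier stage output at the current
        # resolution, project, then run the next residual stage.
        size = feats[-1].shape[-2:]
        dense = torch.cat([F.adaptive_avg_pool2d(f, size) for f in feats], 1)
        return self.stages[i](self.dense_proj[i](dense))

    def forward(self, x):                        # x: (B, 3, 224, 224)
        # CNN branch up to the mid point (stages 1-2).
        feats = [self.stem(x)]
        for i in range(2):
            feats.append(self._cnn_stage(i, feats))
        cnn_mid = feats[-1]                      # (B, 128, 28, 28)

        # Transformer branch up to the mid point.
        tok = self.patch_embed(x).flatten(2).transpose(1, 2)  # (B, 196, D)
        tok = torch.cat([self.cls_token.expand(len(x), -1, -1), tok], 1)
        tok = self.blocks_lo(tok + self.pos_embed)

        # Mid-level fusion: add each branch's pooled summary to the other.
        g = self.cnn_to_tok(cnn_mid.mean(dim=(2, 3)))         # (B, D)
        tok = tok + g.unsqueeze(1)
        c = self.tok_to_cnn(tok[:, 0])                        # (B, 128)
        feats[-1] = cnn_mid + c[:, :, None, None]

        # Remaining stages of both branches.
        for i in range(2, 4):
            feats.append(self._cnn_stage(i, feats))
        tok = self.blocks_hi(tok)

        # End fusion: pooled CNN feature concatenated with the [CLS] token.
        fused = torch.cat([feats[-1].mean(dim=(2, 3)), tok[:, 0]], dim=1)
        return self.head(fused)


# Quick shape check (binary benign/malignant classification, as on BreakHis).
if __name__ == "__main__":
    model = DualBranchFusionNet(num_classes=2)
    print(model(torch.randn(2, 3, 224, 224)).shape)  # torch.Size([2, 2])
```

Running both branches stage by stage in one forward pass is what makes the mid-level exchange possible; a hypothetical training loop would use this model like any other image classifier.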

History
  • Received: 2023-08-11
  • Revised: 2023-08-11
  • Accepted: 2023-11-07
  • Online:
  • Published: