Abstract: Existing deep learning methods struggle to jointly represent the global and local features of breast cancer histopathology images. To address this, a classification method based on multi-level deep feature fusion is proposed, combining a Transformer, with its long-range dependency modeling, and a convolutional neural network (CNN), with its strong local perception. The method adopts a dual-branch parallel architecture with DeiT-B and ResNet-18 as the backbone, and introduces feature fusion both at the intermediate layers and at the output of the two branches, effectively strengthening the joint learning of global and local deep features of breast cancer pathology images. In addition, dense connections are introduced between the residual blocks of the CNN branch to improve the propagation of the intermediate fused features. Through global-local feature extraction and feature interaction within and between branches, discriminative features for breast cancer pathology image classification are captured more effectively. Ablation and comparison experiments on the public BreakHis breast cancer histopathology dataset demonstrate the effectiveness of the proposed method, which achieves a best classification accuracy of 99.83%.
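The dual-branch fusion described above can be sketched minimally as follows. This is a hypothetical NumPy illustration, not the authors' implementation: all feature dimensions, the sum-after-projection rule for mid-layer fusion, and the concatenation rule for end fusion are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    """Plain affine map standing in for a learned projection layer."""
    return x @ w + b

batch = 2
# Assumed feature widths: 768 for the DeiT-B class token, 128 and 512 for
# ResNet-18 mid-stage and final-stage channels after pooling.
g_mid = rng.standard_normal((batch, 768))   # Transformer branch, mid layer
c_mid = rng.standard_normal((batch, 128))   # CNN branch, mid layer

# Mid-layer fusion: project both branches to a shared width and add
# (one plausible fusion rule; the abstract does not specify the operator).
d = 256
Wg, bg = 0.01 * rng.standard_normal((768, d)), np.zeros(d)
Wc, bc = 0.01 * rng.standard_normal((128, d)), np.zeros(d)
fused_mid = linear(g_mid, Wg, bg) + linear(c_mid, Wc, bc)

# End fusion: concatenate the two branch outputs and classify
# (benign vs. malignant, the binary BreakHis task).
g_out = rng.standard_normal((batch, 768))
c_out = rng.standard_normal((batch, 512))
fused_end = np.concatenate([g_out, c_out], axis=1)
Wcls, bcls = 0.01 * rng.standard_normal((768 + 512, 2)), np.zeros(2)
logits = linear(fused_end, Wcls, bcls)
print(fused_mid.shape, logits.shape)  # (2, 256) (2, 2)
```

In a trained model the projection weights would be learned jointly with both backbones, so the mid-layer fusion also serves as a channel through which gradient signal couples the two branches.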