A Swin transformer and residual network combined model for breast cancer disease multi-classification using histopathological images

Keywords

Breast cancer pathological image
Swin transformer
ResNet101
Focal loss

How to Cite

ZHUANG, J., WU, X., MENG, D., & JING, S. (2024). A Swin transformer and residual network combined model for breast cancer disease multi-classification using histopathological images. Instrumentation, 11(1), 112–120. https://doi.org/10.15878/j.instr.202400058

Abstract

Breast cancer has become a leading threat to women's health. In recent years, deep learning methods, represented by convolutional neural networks, have shown great potential in computer-aided diagnosis of breast cancer. To improve accuracy, a breast cancer multi-classification model combining the Swin transformer network (SwinT) and a residual network (ResNet101) is proposed, which makes full use of SwinT's ability to model global context and ResNet101's ability to extract local features; in addition, the cross-entropy loss function is replaced by the focal loss function to address class imbalance in breast cancer datasets. The eight-class recognition accuracy of the proposed fusion model on breast cancer pathological images at four magnification factors reaches 96.1% to 97.5%. Compared with the single SwinT and ResNet101 models, the fusion model achieves higher accuracy and better generalization, providing a more effective method for the screening, diagnosis, and pathological classification of female breast cancer.
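The abstract notes that cross-entropy is replaced by focal loss to counter class imbalance. Below is a minimal NumPy sketch of the standard α-balanced focal loss, FL(p_t) = -α(1 - p_t)^γ log(p_t); the values γ = 2 and α = 0.25 are common defaults for illustration, not the settings reported in the paper:

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=0.25):
    """Mean multi-class focal loss: FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t).

    probs:   (N, C) array of softmax probabilities
    targets: (N,) array of integer class labels
    """
    # p_t is the predicted probability of the true class for each sample
    p_t = probs[np.arange(len(targets)), targets]
    # The (1 - p_t)^gamma factor down-weights easy, well-classified samples,
    # so training focuses on hard (often minority-class) examples
    return float(np.mean(-alpha * (1.0 - p_t) ** gamma * np.log(p_t)))

# Easy examples (true class predicted confidently) contribute far less
# loss than hard examples (true class predicted with low probability)
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.8,  0.1]])
easy = focal_loss(probs, np.array([0, 1]))  # both predictions correct
hard = focal_loss(probs, np.array([1, 0]))  # both predictions wrong
```

With γ = 0 and α = 1 this reduces to ordinary cross-entropy; increasing γ sharpens the down-weighting of easy samples, which is what makes the loss suited to imbalanced pathology datasets.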

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright (c) 2024 Xiaohui WU, Jianjun ZHUANG, Dongdong MENG, Shenghua JING
