Research Article

Improving Breast Cancer Diagnosis using Attention-Enhanced Hybrid CNN–Transformer Model

Year 2025, Volume: 14, Issue: 4, pp. 2616–2638, 31.12.2025
https://doi.org/10.17798/bitlisfen.1772185

Abstract

Breast cancer is the most frequently diagnosed cancer among women worldwide. Recent advances in deep learning have led to significant improvements in tumor detection from breast ultrasound images, enhancing the diagnostic accuracy of breast cancer screening. Although deep convolutional neural networks (CNNs) and transformer-based architectures have individually yielded promising results, challenges such as low contrast, spatial variability, and irregular tumor shapes continue to limit the robustness of current methods. In this study, a novel hybrid CNN–Transformer framework is therefore proposed to improve discriminative feature extraction for breast ultrasound image analysis. The network employs a dual-branch architecture that integrates features extracted by both a CNN and a transformer model. In the first branch, the Swin Transformer is combined with a Triplet Attention module to strengthen its ability to learn long-range dependencies and global contextual information; this module processes feature maps along three orthogonal axes, enabling a more effective representation of both spatial and channel-level relationships. The second branch incorporates the EfficientNet architecture augmented with an Efficient Channel Attention (ECA) module, which performs adaptive channel-level feature recalibration and allows the model to emphasize diagnostically salient regions within ultrasound images. High-level features from both branches are fused for the final classification. Experimental results on the BUSI dataset demonstrate that the proposed architecture achieves superior performance, with 97.4% accuracy, 97.9% precision, 97.9% sensitivity, and a 97.9% F1-score. These outcomes confirm the effectiveness of the proposed hybrid CNN–Transformer design for improving automated breast cancer diagnosis from ultrasound imaging.
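To make the described pipeline concrete, the sketch below is a minimal PyTorch rendering of the dual-branch design, using the standard published formulations of Triplet Attention (Misra et al., 2021) and ECA (Wang et al., 2020). It is a sketch under stated assumptions, not the authors' implementation: the small convolutional stems stand in for the pretrained Swin Transformer and EfficientNet backbones, and the 64-dimensional features, concatenation fusion, and three-class head (normal/benign/malignant on BUSI) are illustrative choices.

```python
import torch
import torch.nn as nn


class ZPool(nn.Module):
    """Concatenate max- and mean-pooled maps along the channel axis."""
    def forward(self, x):
        return torch.cat([x.max(dim=1, keepdim=True)[0],
                          x.mean(dim=1, keepdim=True)], dim=1)


class AttentionGate(nn.Module):
    """One Triplet Attention branch: Z-pool -> conv -> sigmoid gate."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.pool = ZPool()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(self.pool(x)))


class TripletAttention(nn.Module):
    """Attention along three orthogonal axes (Misra et al., WACV 2021)."""
    def __init__(self):
        super().__init__()
        self.cw, self.ch, self.hw = (AttentionGate(), AttentionGate(),
                                     AttentionGate())

    def forward(self, x):
        # Rotate the tensor so each gate attends over a different axis pair,
        # then rotate back and average the three refined maps.
        x_cw = self.cw(x.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)
        x_ch = self.ch(x.permute(0, 3, 2, 1)).permute(0, 3, 2, 1)
        return (x_cw + x_ch + self.hw(x)) / 3.0


class ECA(nn.Module):
    """Efficient Channel Attention: 1-D conv over pooled channel stats."""
    def __init__(self, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        y = x.mean(dim=(2, 3))                        # global avg pool: (B, C)
        y = torch.sigmoid(self.conv(y.unsqueeze(1)))  # (B, 1, C)
        return x * y.squeeze(1)[..., None, None]      # channel recalibration


class HybridClassifier(nn.Module):
    """Dual-branch sketch: small conv stems stand in for the pretrained
    Swin Transformer and EfficientNet backbones used in the paper."""
    def __init__(self, num_classes=3, dim=64):
        super().__init__()
        def stem():  # placeholder encoder, NOT the actual backbone
            return nn.Sequential(
                nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU())
        self.swin_like, self.triplet = stem(), TripletAttention()
        self.effnet_like, self.eca = stem(), ECA()
        self.head = nn.Linear(2 * dim, num_classes)

    def forward(self, x):
        a = self.triplet(self.swin_like(x)).mean(dim=(2, 3))  # branch 1 vector
        b = self.eca(self.effnet_like(x)).mean(dim=(2, 3))    # branch 2 vector
        return self.head(torch.cat([a, b], dim=1))            # fused logits


logits = HybridClassifier()(torch.randn(2, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 3])
```

In this sketch each branch is refined by its attention module before global average pooling, and the two pooled vectors are concatenated and classified, mirroring the late feature fusion described in the abstract.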

Ethical Statement

The study complies with research and publication ethics.

Details

Primary Language: English
Subjects: Artificial Intelligence (Other), Signal Processing
Journal Section: Research Article
Authors

Aslı Nur Polat (ORCID: 0000-0001-6262-4968)

Hussein Mahmood Abdo Mohammed (ORCID: 0000-0001-5491-5669)

Submission Date: August 25, 2025
Acceptance Date: December 15, 2025
Publication Date: December 31, 2025
Published in Issue: Year 2025, Volume: 14, Issue: 4

Cite

IEEE: [1] A. N. Polat and H. M. A. Mohammed, “Improving Breast Cancer Diagnosis using Attention-Enhanced Hybrid CNN–Transformer Model”, Bitlis Eren Üniversitesi Fen Bilimleri Dergisi, vol. 14, no. 4, pp. 2616–2638, Dec. 2025, doi: 10.17798/bitlisfen.1772185.
