Research Article

Comparison of Transfer Learning and Vision Transformer Models for Skin Cancer Classification

Year 2025, Volume: 15, Issue: 2, 179 - 188, 31.12.2025

Abstract

In recent years, deep learning methods have made great progress in medical imaging. Convolutional Neural Networks and Transformer-based models in particular have reached high success rates in this field. In this study, the performance of five transfer learning architectures (DenseNet169, InceptionV3, MobileNetV2, VGG16, and Xception) and a Vision Transformer (ViT) model was compared for the classification of skin cancer types. The dataset used in the study is the Skin Cancer Dataset. In the first phase, the ViT model showed the highest performance on all metrics, with 93.79% recall, 92.22% precision, 93.00% F1-score, and 92.42% accuracy. The other models, especially InceptionV3 and MobileNetV2, exhibited high recall but did not reach the overall accuracy of ViT. In the second phase, which analyzed enhanced images, the original images were first converted to grayscale, and then thresholding, Canny edge detection, dilation, and erosion were applied. These image processing steps increased the contrast of the images and made the lesion boundaries more distinct. When analyzed with these enhanced images, ViT again achieved the highest performance and was clearly superior, with 95.49% recall, 94.17% precision, 94.83% F1-score, and 94.39% accuracy. The ViT model stands out as the model providing the highest accuracy and reliability in medical imaging applications, particularly on complex and enhanced images, and offers a clear advantage over the other models. Furthermore, this study clearly shows that image processing techniques can substantially improve the accuracy and efficiency of deep learning models. These findings emphasize that, when selecting a model for medical imaging, the advantages and limitations of each method should be considered and the most suitable model should be chosen according to the application requirements and the dataset.
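The abstract names a concrete enhancement pipeline (grayscale conversion, thresholding, Canny edge detection, dilation, erosion). The following is a minimal OpenCV sketch of how such a chain could look; the threshold value, Canny limits, kernel size, and file names are illustrative assumptions, and the exact form of the enhanced output fed to the networks is not specified in the abstract.

import cv2
import numpy as np

def enhance_lesion_image(path, thresh_val=120, kernel_size=3):
    # Illustrative parameter values; the study's actual settings are not given here.
    img = cv2.imread(path)                            # BGR dermoscopic image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)      # grayscale conversion
    _, binary = cv2.threshold(gray, thresh_val, 255, cv2.THRESH_BINARY)  # thresholding
    edges = cv2.Canny(binary, 100, 200)               # Canny edge detection
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    dilated = cv2.dilate(edges, kernel, iterations=1) # dilation: thicken lesion boundaries
    eroded = cv2.erode(dilated, kernel, iterations=1) # erosion: remove small artifacts
    return eroded

# Hypothetical usage:
# enhanced = enhance_lesion_image("lesion_0001.jpg")
# cv2.imwrite("lesion_0001_enhanced.png", enhanced)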


Comparative Analysis of Transfer Learning and Vision Transformer Models for Skin Cancer Classification Using Enhanced Dermoscopic Images

Year 2025, Volume: 15, Issue: 2, 179 - 188, 31.12.2025

Abstract

In recent years, deep learning has achieved remarkable advancements in medical image analysis, particularly through Convolutional Neural Networks (CNNs) and Transformer-based architectures. This study evaluates and compares the performance of five transfer learning models (DenseNet169, InceptionV3, MobileNetV2, VGG16, and Xception) and a Vision Transformer (ViT) model for the classification of skin cancer using the “Skin Cancer: Malignant vs. Benign” dataset. In the first phase, the ViT model achieved the highest overall performance with 93.79% recall, 92.22% precision, 93.00% F1-score, and 92.42% accuracy. Although InceptionV3 and MobileNetV2 demonstrated strong recall values, they did not match the overall accuracy of ViT. In the second phase, image enhancement techniques (grayscale conversion, thresholding, Canny edge detection, dilation, and erosion) were applied to emphasize lesion boundaries and improve contrast. Using these enhanced images, the ViT model again achieved the best performance, with 95.49% recall, 94.17% precision, 94.83% F1-score, and 94.39% accuracy. These results indicate that the ViT architecture provides superior accuracy and reliability on complex and enhanced medical images. Furthermore, the study demonstrates that incorporating image preprocessing techniques can significantly enhance the performance of deep learning models in medical imaging applications.
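The paper does not state its training framework. As a hedged illustration of how one of the listed transfer-learning baselines (DenseNet169) and the reported metrics (accuracy, precision, recall, F1-score) could be set up, the sketch below assumes a Keras/TensorFlow environment with scikit-learn for evaluation; the input resolution, classification head, dropout rate, and optimizer are assumptions, not the paper's reported configuration.

import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

IMG_SIZE = (224, 224)  # assumed input resolution

def build_transfer_model(num_classes=2):
    # ImageNet-pretrained DenseNet169 backbone with a new classification head
    # for the binary malignant-vs-benign task; hyperparameters are illustrative.
    base = tf.keras.applications.DenseNet169(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False  # freeze pretrained features, the usual transfer-learning setup
    inputs = layers.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.densenet.preprocess_input(inputs)
    x = base(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def report_metrics(y_true, y_pred):
    # The four metrics quoted in the abstract, computed for binary labels (0/1).
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
    }

Swapping the backbone for InceptionV3, MobileNetV2, VGG16, or Xception only changes the applications call and its matching preprocess_input; the ViT comparison would rely on a separate transformer implementation not sketched here.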


Details

Primary Language: English
Subjects: Software Engineering (Other)
Section: Research Article
Authors

Yasin Özkan (ORCID: 0000-0002-2029-0856)

Submission Date: May 28, 2025
Acceptance Date: December 23, 2025
Publication Date: December 31, 2025
Published Issue: Year 2025, Volume: 15, Issue: 2

Cite

APA Özkan, Y. (2025). Comparative Analysis of Transfer Learning and Vision Transformer Models for Skin Cancer Classification Using Enhanced Dermoscopic Images. European Journal of Technique (EJT), 15(2), 179-188. https://doi.org/10.36222/ejt.1708219