Research Article

Comparative analysis of the effect of color space transformations on transformer-based skin cancer classification

Year 2026, Issue: 064, 15–28, 30.03.2026
https://doi.org/10.59313/jsr-a.1885019
https://izlik.org/JA65UM54XM

Abstract

Skin cancer is the most common type of cancer worldwide; it is a life-threatening condition that can lead to serious health problems if not detected early, and its incidence is increasing. In recent years, computer vision and decision support systems have been used to detect disease in dermoscopic images. It has been observed that data representation methods affect the detection performance of these models, yet the effect of color information on transformer-based models has not been sufficiently investigated. This study used the International Skin Imaging Collaboration (ISIC) 2017 dataset, which consists of RGB images, and converted these images into the HSV, LAB, and YCbCr color spaces. Transformer-based models, including the vision transformer (ViT), Swin transformer, data-efficient image transformer (DeiT), and self-distillation with no labels (DINO), were used for benign/malignant classification. According to the classification results, the RGB and HSV color spaces yielded stable and high accuracy values, particularly with the DeiT and Swin models. The ViT and DINO models were more sensitive to color space transformations and achieved lower classification performance than the other models. The best result was obtained with the DeiT model trained in the RGB color space, with an accuracy of 0.7668. Furthermore, the explainability-oriented gradient-weighted class activation mapping (Grad-CAM) method was used to analyze which image regions the models focused on when making classification decisions. This study demonstrates the effect and usability of color space transformations in transformer-based skin cancer classification and offers a comparative contribution to the literature.
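The color space conversions described above are typically performed with library routines (e.g., OpenCV's `cv2.cvtColor`); the abstract does not specify the implementation. As an illustration only, the RGB-to-YCbCr transform can be sketched in pure NumPy using the standard ITU-R BT.601 full-range coefficients — the function name and coefficient choice here are assumptions, not the paper's code:

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB uint8 image of shape (H, W, 3) to YCbCr
    using the ITU-R BT.601 full-range conversion matrix."""
    img = img.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =          0.299000 * r + 0.587000 * g + 0.114000 * b  # luma
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.500000 * b  # blue-difference chroma
    cr = 128.0 + 0.500000 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return np.clip(np.stack([y, cb, cr], axis=-1), 0, 255).astype(np.uint8)

# Sanity check: a pure gray pixel carries no chroma, so Cb = Cr = 128.
gray = np.full((1, 1, 3), 100, dtype=np.uint8)
print(rgb_to_ycbcr(gray)[0, 0])  # → [100 128 128]
```

Separating luma from chroma in this way is what lets a classifier weight brightness and color information differently, which is one plausible reason the choice of color space affects model performance.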

References

  • [1] G. M. S. Himel, M. M. Islam, K. A. Al-Aff, S. I. Karim, and M. K. U. Sikder, “Skin cancer segmentation and classification using vision transformer for automatic analysis in dermatoscopy-based noninvasive digital system,” Int. J. Biomed. Imaging, vol. 2024, pp. 3022192, 2024.
  • [2] N. C. F. Codella et al., “Skin lesion analysis toward melanoma detection: A challenge at the 2017 International symposium on biomedical imaging (ISBI), hosted by the international skin imaging collaboration (ISIC),” in 2018 IEEE 15th International Symposium on Biomedical Imaging (ISBI 2018), 2018.
  • [3] G. H. Dagnaw, M. El Mouhtadi, and M. Mustapha, “Skin cancer classification using vision transformers and explainable artificial intelligence,” J. Med. Artif. Intell., vol. 7, 2024.
  • [4] M. Arshad et al., “A computer-aided diagnosis system using deep learning for multiclass skin lesion classification,” Comput. Intell. Neurosci., vol. 2021, no. 1, 2021.
  • [5] S. Khan, H. Ali, and Z. Shah, “Identifying the role of vision transformer for skin cancer-A scoping review,” Front. Artif. Intell., vol. 6, 2023.
  • [6] M. A. Khan, T. Akram, Y.-D. Zhang, and M. Sharif, “Attributes based skin lesion detection and recognition: A mask RCNN and transfer learning-based deep learning framework,” Pattern Recognit. Lett., vol. 143, pp. 58–66, 2021.
  • [7] A. Murugan, S. A. H. Nair, and K. P. S. Kumar, “Detection of skin cancer using SVM, Random Forest and kNN classifiers,” J. Med. Syst., vol. 43, no. 8, 2019.
  • [8] R. Kaur, H. G. Hosseini, and R. Sinha, “Lesion border detection of skin cancer images using deep fully convolutional neural network with customized weights,” Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., vol. 2021, pp. 3035–3038, 2021.
  • [9] M. Subramanian, M. A. Ala Walid, S. P. Mallick, R. Rastogi, A. Chauhan, and A. Vidya, “Melanoma skin cancer detection using a CNN-regularized extreme learning machine (RELM) based model,” in 2023 Second International Conference on Electronics and Renewable Systems (ICEARS), 2023.
  • [10] Y. S. Alsahafi, M. A. Kassem, and K. M. Hosny, “Skin-Net: a novel deep residual network for skin lesions classification using multilevel feature extraction and cross-channel correlation with detection of outlier,” J. Big Data, vol. 10, no. 1, 2023.
  • [11] A. Esteva et al., “Corrigendum: Dermatologist-level classification of skin cancer with deep neural networks,” Nature, vol. 546, no. 7660, 2017.
  • [12] A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, and N. Houlsby, “An image is worth 16×16 words: Transformers for image recognition at scale,” 2020, arXiv:2010.11929.
  • [13] H. Alquran et al., “The melanoma skin cancer detection and classification using support vector machine,” in 2017 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT), 2017.
  • [14] D. Mamadou, K. J. Ayikpa, A. B. Ballo, and B. M. Kouassi, “Analysis of the impact of color spaces on skin cancer diagnosis using deep learning techniques,” Rev. D Intell. Artif., vol. 37, no. 6, pp. 1377–1385, 2023.
  • [15] R. Velastegui and M. Pedersen, “The impact of using different color spaces in histological image classification using convolutional neural networks,” in 2021 9th European Workshop on Visual Information Processing (EUVIP), 2021.
  • [16] D. S. Luz, T. J. B. Lima, R. R. V. Silva, D. M. V. Magalhães, and F. H. D. Araujo, “Automatic detection metastasis in breast histopathological images based on ensemble learning and color adjustment,” Biomed. Signal Process. Control, vol. 75, no. 103564, 2022.
  • [17] “ISIC Challenge,” Isic-archive.com. [Online]. Available: https://challenge.isic-archive.com/landing/2017/. [Accessed: 22-Jan-2026].
  • [18] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 4th ed. Upper Saddle River, NJ: Pearson, 2017.
  • [19] A. Kamboj, R. Rani, and N. A. Shalu, “Automated melanoma skin cancer detection from digital images,” Int. J. Biomed. Eng. Technol., vol. 37, no. 3, 2021.
  • [20] L. Singh, R. R. Janghel, and S. P. Sahu, “A hybrid feature fusion strategy for early fusion and majority voting for late fusion towards melanocytic skin lesion detection,” Int. J. Imaging Syst. Technol., vol. 32, no. 4, pp. 1231–1250, 2022.
  • [21] H.-W. Huang and Y. Zhang, “Flame colour characterization in the visible and infrared spectrum using a digital camera and image processing,” Meas. Sci. Technol., vol. 19, no. 8, 2008.
  • [22] M. E. Celebi et al., “A methodological approach to the classification of dermoscopy images,” Comput. Med. Imaging Graph., vol. 31, no. 6, pp. 362–373, 2007.
  • [23] C. Poynton, Digital Video and HD: Algorithms and Interfaces, 2nd ed. Oxford, England: Morgan Kaufmann, 2012.
  • [24] S. Oukil, R. Kasmi, K. Mokrani, and B. García-Zapirain, “Automatic segmentation and melanoma detection based on color and texture features in dermoscopic images,” Skin Res. Technol., vol. 28, no. 2, pp. 203–211, 2022.
  • [25] Z. Liu et al., “Swin transformer: Hierarchical vision transformer using shifted windows,” in 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 2021.
  • [26] H. Touvron, M. Cord, M. Douze, F. Massa, A. Sablayrolles, and H. Jégou, “Training data-efficient image transformers & distillation through attention,” 2020, arXiv:2012.12877.
  • [27] M. Caron et al., “Emerging properties in self-supervised vision transformers,” in 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 2021.
  • [28] M. Sokolova and G. Lapalme, “A systematic analysis of performance measures for classification tasks,” Inf. Process. Manag., vol. 45, no. 4, pp. 427–437, 2009.
  • [29] R. R. Selvaraju, M. Cogswell, A. Das, R. Vedantam, D. Parikh, and D. Batra, “Grad-CAM: Visual explanations from deep networks via gradient-based localization,” Int. J. Comput. Vis., vol. 128, no. 2, pp. 336–359, 2020.

Details

Primary Language English
Subjects Deep Learning, Electrical Engineering (Other)
Journal Section Research Article
Authors

Feyza Yılmaz 0000-0002-6989-2823

Submission Date February 9, 2026
Acceptance Date March 10, 2026
Publication Date March 30, 2026
DOI https://doi.org/10.59313/jsr-a.1885019
IZ https://izlik.org/JA65UM54XM
Published in Issue Year 2026 Issue: 064

Cite

IEEE [1] F. Yılmaz, “Comparative analysis of the effect of color space transformations on transformer-based skin cancer classification”, JSR-A, no. 064, pp. 15–28, Mar. 2026, doi: 10.59313/jsr-a.1885019.