Research Article



Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification

Year 2025, Early View, 1 - 1
https://doi.org/10.35378/gujs.1549629

Abstract

This study introduces a novel ensemble model that leverages color space transformations to improve skin cancer classification accuracy. The proposed model enhances the accuracy of distinguishing between benign and malignant skin lesions by using three baseline classifiers, each specialized in a different color representation (RGB, HSI, and YCbCr), and combining their outputs with a majority-voting decision rule. The experimental study was conducted on the ISIC database using four CNN architectures (InceptionV3, ResNet101V2, InceptionResNetV2, and MobileNetV2) for each of the three color spaces. The results reveal that the proposed model consistently outperformed the three baseline classifiers, demonstrating a reduction in misclassification rates and an improvement in F1 score of approximately 1% on the ISIC database. This gain was achieved without applying any preprocessing. The F1 scores of the baseline classifiers and the proposed ensemble model were compared using the Friedman test. The generalizability of the proposed model was evaluated by repeating the same experiments on the PH2 dataset. Our findings indicate that incorporating multiple color spaces into an ensemble model can enhance classification performance, providing a promising approach for early and accurate skin cancer diagnosis.
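The pipeline the abstract describes — one classifier per color representation, fused by majority vote — can be sketched as follows. This is an illustrative sketch only: the color-space conversions are the standard ITU-R BT.601 (YCbCr) and geometric (HSI) formulations, and the CNN classifiers themselves are omitted; the function names are our own, not from the paper.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB image (H, W, 3, floats in [0, 1]) to YCbCr (ITU-R BT.601)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def rgb_to_hsi(img):
    """Convert an RGB image (H, W, 3, floats in [0, 1]) to HSI."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    i = (r + g + b) / 3.0                                   # intensity
    min_c = np.minimum(np.minimum(r, g), b)
    s = 1.0 - min_c / np.maximum(i, 1e-8)                   # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-8
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2 * np.pi - theta) / (2 * np.pi)  # hue in [0, 1]
    return np.stack([h, s, i], axis=-1)

def majority_vote(predictions):
    """Fuse binary votes (0 = benign, 1 = malignant), one per color-space classifier."""
    votes = np.asarray(predictions)
    return int(votes.sum() > len(votes) / 2)
```

In use, each dermoscopic image would be converted to the three representations, passed through the corresponding trained CNN, and the three binary predictions fused with `majority_vote` — e.g. votes of (1, 1, 0) yield a malignant decision.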

References

  • [1] https://www.skincancer.org/skin-cancer-information/skin-cancer-facts/. Access date: 26.02.2025.
  • [2] Parkin, D.M., Mesher, D., and Sasieni, P., “13. Cancers attributable to solar (ultraviolet) radiation exposure in the UK in 2010”, British Journal of Cancer, 105(2): 66–69, (2011).
  • [3] Filimon, D.M. and Albu, A., “Skin diseases diagnosis using artificial neural networks”, 9th IEEE International Symposium on Applied Computational Intelligence and Informatics, Timisoara, Romania, 189-194, (2014).
  • [4] Hasan, M.K., Ahamad, M.A., Yap, C.H., and Yang, G., “A survey, review, and future trends of skin lesion segmentation and classification”, Computers in Biology and Medicine, 155(106624): 106624, (2023).
  • [5] Matthews, N.H., Li, W.Q., Qureshi, A.A., Weinstock, M.A., Cho, E., Cutaneous Melanoma: Etiology and Therapy, Codon Publications, Brisbane (AU), 3-22, (2017).
  • [6] https://www.cancer.org/research/cancer-facts-statistics/all-cancer-facts-figures/cancer-facts-figures-2020.html. Access date: 26.02.2025.
  • [7] Rao, B.K. and Ahn, C.S., “Dermatoscopy for melanoma and pigmented lesions”, Dermatologic Clinics, 30(3): 413–434, (2012).
  • [8] Kittler, H., Pehamberger, H., Wolff, K., and Binder, M., “Diagnostic accuracy of dermoscopy”, Lancet Oncology, 3(3): 159–165, (2002).
  • [9] Uhl, F., Græsdal Rasmussen, T., and Oppelt, N., “Classification ensembles for beach cast and drifting vegetation mapping with Sentinel-2 and PlanetScope”, Geosciences (Basel), 12(1): 15, (2021).
  • [10] Kuncheva, L.I., Whitaker, C.J., Shipp, C.A., and Duin, R.P.W., “Limits on the majority vote accuracy in classifier fusion”, Pattern Analysis and Applications, 6(1): 22–31, (2003).
  • [11] Santos, A.R.F., Aires, K.R.T., and Veras, R.M.S., “Aspects of lighting and color in classifying malignant skin cancer with deep learning”, Applied Sciences (Basel), 14(8): 3297, (2024).
  • [12] Abbas, Q., Celebi, M.E., Garcia, I.F., and Ahmad, W., “Melanoma recognition framework based on expert definition of ABCD for dermoscopic images”, Skin Research and Technology, 19(1): e93-102, (2013).
  • [13] Barata, C., Figueiredo, M.A.T., Celebi, M.E., and Marques, J.S., “Color identification in dermoscopy images using gaussian mixture models”, 2014 IEEE International Conference on Acoustics, Speech and Signal Processing, Florence, Italy, 3611-3615, (2014).
  • [14] Shalu, N.A., Rani, R., and Kamboj, A., “Automated melanoma skin cancer detection from digital images”, International Journal of Biomedical Engineering and Technology, 37(3): 275, (2021).
  • [15] Oukil, S., Kasmi, R., Mokrani, K., and García-Zapirain, B., “Automatic segmentation and melanoma detection based on color and texture features in dermoscopic images”, Skin Research and Technology, 28(2): 203–211, (2022).
  • [16] Mamadou, D., Ayikpa, K.J., Ballo, A.B., and Kouassi, B.M., “Analysis of the impact of color spaces on skin cancer diagnosis using deep learning techniques”, Revue d'Intelligence Artificielle, 37(6): 1377–1385, (2023).
  • [17] Nigar, N., Umar, M., Shahzad, M.K., Islam, S., and Abalo, D., “A deep learning approach based on explainable artificial intelligence for skin lesion classification”, IEEE Access, 10: 113715–113725, (2022).
  • [18] Xie, Y., Zhang, J., Xia, Y., and Shen, C., “A mutual bootstrapping model for automated skin lesion segmentation and classification”, IEEE Transactions on Medical Imaging, 39(7): 2482–2493, (2020).
  • [19] Jayapriya, K. and Jacob, I.J., “Hybrid fully convolutional networks‐based skin lesion segmentation and melanoma detection using deep feature”, International Journal of Imaging Systems and Technology, 30(2): 348–357, (2020).
  • [20] Aldwgeri, A. and Abubacker, N.F., “Ensemble of deep convolutional neural network for skin lesion classification in dermoscopy images”, 6th International Visual Informatics Conference, Bangi, Malaysia, 19-21, (2019).
  • [21] Li, Y. and Shen, L., “Skin lesion analysis towards melanoma detection using deep learning network”, Sensors (Basel), 18(2), (2018).
  • [22] Saravana Kumar, N.M., Hariprasath, K., Tamilselvi, S., Kavinya, A., and Kaviyavarshini, N., “Detection of stages of melanoma using deep learning”, Multimedia Tools and Applications, 80(12): 18677–18692, (2021).
  • [23] Zhang, N., Cai, Y.X., Wang, Y.Y., Tian, Y.T., Wang X.L., and Badami, B., “Skin cancer diagnosis based on optimized convolutional neural network”, Artificial Intelligence in Medicine, 102: 101756, (2020).
  • [24] Mohakud, R. and Dash, R., “Designing a grey wolf optimization based hyper-parameter optimized convolutional neural network classifier for skin cancer detection”, Journal of King Saud University - Computer and Information Sciences, 34(8): 6280–6291, (2022).
  • [25] Gu, R., Wang, L., and Zhang, L., “DE-Net: a deep edge network with boundary information for automatic skin lesion segmentation”, Neurocomputing, 468: 71–84, (2022).
  • [26] Ali, K., Shaikh, Z.A., Khan, A.A., and Laghari, A.A., “Multiclass skin cancer classification using EfficientNets – a first step towards preventing skin cancer”, Neuroscience Informatics, 2(4): 100034, (2022).
  • [27] Saba, T., Khan, M.A., Rehman, A., and Marie-Sainte, S.L., “Region extraction and classification of skin cancer: a heterogeneous framework of deep CNN features fusion and reduction”, Journal of Medical Systems, 43(9): 289, (2019).
  • [28] Jojoa Acosta, M.F., Caballero Tovar, L.Y., Garcia-Zapirain, M.B., and Percybrooks, W.S., “Melanoma diagnosis using deep learning techniques on dermatoscopic images”, BMC Medical Imaging, 21(1): 6, (2021).
  • [29] Mahum, R. and Aladhadh, S., “Skin lesion detection using hand-crafted and DL-based features fusion and LSTM”, Diagnostics (Basel), 12: 2974, (2022).
  • [30] Al-Saedi, D.K.A. and Savaş, S., “Classification of skin cancer with deep transfer learning method”, Computer Science, 202-210, (2022).
  • [31] Nersisson, R., Iyer, T.J., Joseph Raj, A.N., and Rajangam, V., “A dermoscopic skin lesion classification technique using YOLO-CNN and traditional feature model”, Arabian Journal for Science and Engineering, 46(10): 9797–9808, (2021).
  • [32] Guo, L., “A multimodel ensemble method using CNN and maximum correntropy criterion for basal cell carcinoma and seborrheic keratoses classification”, 2019 International Joint Conference on Neural Networks, Budapest, Hungary, 1-6, (2019).
  • [33] Oliveira, R.B., Pereira, A.S., and Tavares, J.M.R.S., “Skin lesion computational diagnosis of dermoscopic images: Ensemble models based on input feature manipulation”, Computer Methods and Programs in Biomedicine, 149: 43–53, (2017).
  • [34] Schaefer, G., Krawczyk, B., Celebi, M.E., and Iyatomi, H., “An ensemble classification approach for melanoma diagnosis”, Memetic Computing, 6(4): 233–240, (2014).
  • [35] Cao, X., Pan, J.S., Wang, Z., Sun, Z., Haq, A.U., Deng, W.A., and Yang, S., “Application of generated mask method based on mask R-CNN in classification and detection of melanoma”, Computer Methods and Programs in Biomedicine, 207: 106174, (2021).
  • [36] Harangi, B., “Skin lesion classification with ensembles of deep convolutional neural networks”, Journal of Biomedical Informatics, 86: 25–32, (2018).
  • [37] Le, D.N.T., Le, H.X., Ngo, L.T., and Ngo, H.T., “Transfer learning with class-weighted and focal loss function for automatic skin cancer classification”, arXiv, (2020).
  • [38] https://www.kaggle.com/datasets/hasnainjaved/melanoma-skin-cancer-dataset-of-10000-images. Access date: 26.02.2025.
  • [39] Hasan, M.K., Dahal, L., Samarakoon, P.N., Tushar, F.I., and Martí, R., “DSNet: automatic dermoscopic skin lesion segmentation”, Computers in Biology and Medicine, 120: 103738, (2020).
  • [40] MacDonald, L.W. and Ronnier Luo, M., “Colour imaging, vision and technology”, Color Research and Application, 27(6): 455, (2002).
  • [41] Tkalcic, M. and Tasic, J.F., “Color spaces: perceptual, historical and applicational background”, The IEEE Region 8 EUROCON 2003, Computer as a Tool, Ljubljana, Slovenia, 304-308, (2003).
  • [42] Chollet, F., “Xception: Deep learning with depthwise separable convolutions”, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 1800-1807, (2017).
  • [43] Schwenker, F., “Ensemble methods: foundations and algorithms”, IEEE Computational Intelligence Magazine, 8(1): 77–79, (2013).
  • [44] He, K., Zhang, X., Ren, S., and Sun, J., “Identity mappings in deep residual networks”, 14th Computer Vision European Conference, Amsterdam, Netherlands, 630–645, (2016).
  • [45] Szegedy, C., Ioffe, S., Vanhoucke, V., and Alemi, A., “Inception-v4, Inception-ResNet and the impact of residual connections on learning”, AAAI Conference on Artificial Intelligence, San Francisco, California USA, (2017).
  • [46] Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.C., “MobileNetV2: Inverted residuals and linear bottlenecks”, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 4510-4520, (2018).
  • [47] Mohammed, A. and Kora, R., “A comprehensive review on ensemble deep learning: opportunities and challenges”, Journal of King Saud University - Computer and Information Sciences, 35(2): 757–774, (2023).
  • [48] Day, W., “Consensus methods as tools for data analysis”, 1st Conference of the International Federation of Classification Societies, North-Holland, 317–324, (1987).
  • [49] Grofman, B., Owen, G., and Feld, S.L., “Thirteen theorems in search of the truth,” Theory and Decision, 15(3): 261–278, (1983).
  • [50] Kuncheva, L.I. and Alpaydin, E., “Combining pattern classifiers: methods and algorithms”, IEEE Transactions on Neural Networks, 18(3): 964–964, (2007).
  • [51] Nandi, A. and Pal, A.K., Interpreting Machine Learning Models, Apress, Berkeley, CA, USA, 97–115, (2022).
  • [52] Mendonca, T., Ferreira, P.M., Marques, J.S., Marcal, A.R.S., and Rozeira, J., “PH2 - a dermoscopic image database for research and benchmarking”, Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Osaka, Japan, 5437–5440, (2013).
There are 52 citations in total.

Details

Primary Language English
Subjects Deep Learning, Neural Networks
Journal Section Research Article
Authors

Feyza Yılmaz 0000-0002-6989-2823

Rifat Edizkan 0000-0002-5125-5631

Ömer Nezih Gerek 0000-0001-8183-1356

Early Pub Date October 5, 2025
Publication Date October 23, 2025
Submission Date September 13, 2024
Acceptance Date July 25, 2025
Published in Issue Year 2025 Early View

Cite

APA Yılmaz, F., Edizkan, R., & Gerek, Ö. N. (2025). Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification. Gazi University Journal of Science, 1-1. https://doi.org/10.35378/gujs.1549629
AMA Yılmaz F, Edizkan R, Gerek ÖN. Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification. Gazi University Journal of Science. Published online October 1, 2025:1-1. doi:10.35378/gujs.1549629
Chicago Yılmaz, Feyza, Rifat Edizkan, and Ömer Nezih Gerek. “Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification”. Gazi University Journal of Science, October (October 2025), 1-1. https://doi.org/10.35378/gujs.1549629.
EndNote Yılmaz F, Edizkan R, Gerek ÖN (October 1, 2025) Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification. Gazi University Journal of Science 1–1.
IEEE F. Yılmaz, R. Edizkan, and Ö. N. Gerek, “Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification”, Gazi University Journal of Science, pp. 1–1, October 2025, doi: 10.35378/gujs.1549629.
ISNAD Yılmaz, Feyza et al. “Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification”. Gazi University Journal of Science. October 2025. 1-1. https://doi.org/10.35378/gujs.1549629.
JAMA Yılmaz F, Edizkan R, Gerek ÖN. Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification. Gazi University Journal of Science. 2025;:1–1.
MLA Yılmaz, Feyza et al. “Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification”. Gazi University Journal of Science, 2025, pp. 1-1, doi:10.35378/gujs.1549629.
Vancouver Yılmaz F, Edizkan R, Gerek ÖN. Ensemble Model With Color Spaces Transformations For Improving Skin Cancer Classification. Gazi University Journal of Science. 2025:1-1.