Research Article


Localization evaluation of CAM based explainability techniques for plant disease detection

Year 2025, Volume: 31, Issue: 7, 1287-1298, 15.12.2025
https://doi.org/10.5505/pajes.2025.50955

Abstract

In recent years, computer vision technologies have played a critical role in precision agriculture, leveraging robotics and artificial intelligence to automate tasks in crop production. While image-based applications hold promise, model interpretability remains a significant challenge. Explainable artificial intelligence aims to address this by providing plant scientists with interpretable, reliable information, improving the understanding of plant diseases. This study focuses on integrating explainability metrics into model evaluation, with a detailed analysis of explainability methods applied to plant disease classification models. Using Class Activation Mapping based visualization methods with architectures such as EfficientNet, MobileNet, ResNet, and ShuffleNet, trained on a public plant disease dataset, the study assessed both classification success and model explainability. Localization results were derived from an energy-based perspective, assessing how well saliency maps aligned with bounding boxes of diseased areas. The findings reveal that feature dimensions and positions in the images significantly influence classification outcomes, highlighting the importance of precise annotations during data labeling. This study uncovers potential biases in disease detection and emphasizes the need for explainability metrics in evaluating deep learning models, paving the way for more accurate and efficient plant disease detection techniques.
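The energy-based localization evaluation described above can be sketched as follows: the score is the fraction of a saliency map's total activation energy that falls inside the annotated bounding box of the diseased region. This is a minimal illustration under that assumption, not the study's exact implementation; the function and variable names are hypothetical.

```python
import numpy as np

def energy_based_localization(cam: np.ndarray, bbox: tuple) -> float:
    """Energy-based pointing score: fraction of the CAM's total activation
    energy inside the ground-truth bounding box.

    cam:  2-D saliency map (height x width).
    bbox: (x_min, y_min, x_max, y_max) in pixel coordinates, max exclusive.
    """
    cam = np.clip(cam, 0.0, None)        # keep non-negative energy only
    total = cam.sum()
    if total == 0.0:
        return 0.0                       # empty map localizes nothing
    x0, y0, x1, y1 = bbox
    inside = cam[y0:y1, x0:x1].sum()     # energy falling inside the box
    return float(inside / total)

# Toy example: a 4x4 map whose activation sits in the top-left 2x2 block.
cam = np.zeros((4, 4))
cam[:2, :2] = 1.0
print(energy_based_localization(cam, (0, 0, 2, 2)))  # 1.0: all energy in box
print(energy_based_localization(cam, (2, 2, 4, 4)))  # 0.0: box misses it
```

A score near 1 means the attention map concentrates on the annotated diseased region; a low score on a correctly classified image is exactly the kind of bias the study flags, where the model attends to features outside the labeled area.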

References

  • [1] Da Silveira F, Lermen FH, Amaral FG. “An overview of agriculture 4.0 development: systematic review of descriptions, technologies, barriers, advantages, and disadvantages”. Computers and Electronics in Agriculture, 189, 106405, 2021.
  • [2] Albahar M. “A survey on deep learning and its impact on agriculture: challenges and opportunities”. Agriculture, 13(3), 540, 2023.
  • [3] Saranya T, Deisy C, Sridevi S, Anbananthen KSM. “A comparative study of deep learning and internet of things for precision agriculture”. Engineering Applications of Artificial Intelligence, 122, 106034, 2023.
  • [4] Farjon G, Liu H, Edan Y. “Deep-learning-based counting methods, datasets, and applications in agriculture: A review”. Precision Agriculture, 24, 1683-1711, 2023.
  • [5] Chakraborty SK, Chandel NS, Jat D, Tiwari MK, Rajwade YA, Subeesh A. “Deep learning approaches and interventions for futuristic engineering in agriculture”. Neural Computing and Applications, 34, 20539-20573, 2022.
  • [6] Ahmad A, Saraswat D, El Gamal A. “A survey on using deep learning techniques for plant disease diagnosis and recommendations for development of appropriate tools”. Smart Agricultural Technology, 3, 100083, 2023.
  • [7] Abade A, Ferreira PA, de Barros Vidal F. “Plant diseases recognition on images using convolutional neural networks: a systematic review”. Computers and Electronics in Agriculture, 185, 106125, 2021.
  • [8] Ding W, Abdel-Basset M, Hawash H, Ali AM. “Explainability of artificial intelligence methods, applications and challenges: A comprehensive survey”. Information Sciences, 615, 238-292, 2022.
  • [9] Toda Y, Okura F. “How convolutional neural networks diagnose plant disease”. Plant Phenomics, 19, 9237136, 2019.
  • [10] Ghosal S, Blystone D, Singh AK, Ganapathysubramanian B, Singh A, Sarkar S. “An explainable deep machine vision framework for plant stress phenotyping”. Proceedings of the National Academy of Sciences, 115(18), 4613-4618, 2018.
  • [11] Nagasubramanian K, Jones S, Singh AK, Sarkar S, Singh A, Ganapathysubramanian B. “Plant disease identification using explainable 3D deep learning on hyperspectral images”. Plant Methods, 15, 1-10, 2019.
  • [12] Brahimi M, Mahmoudi S, Boukhalfa K, Moussaoui A. “Deep interpretable architecture for plant diseases classification”. 2019 IEEE Signal Processing: Algorithms, Architectures, Arrangements, and Applications, Poznan, Poland, 18-20 September 2019.
  • [13] Arvind C, Totla A, Jain T, Sinha N, Jyothi R, Aditya K, Farhan M, Sumukh G, Ak G. “Deep Learning Based Plant Disease Classification with Explainable AI and Mitigation Recommendation”. 2021 IEEE Symposium Series on Computational Intelligence, 1-8, Orlando, USA, 05-07 December 2021.
  • [14] Batchuluun G, Nam SH, Park KR. “Deep learning-based plant classification and crop disease classification by thermal camera”. Journal of King Saud University-Computer and Information Sciences, 34(10), 10474-10486, 2022.
  • [15] Wei K, Chen B, Zhang J, Fan S, Wu K, Liu G, Chen D. “Explainable deep learning study for leaf disease classification”. Agronomy, 12(5), 1035, 2022.
  • [16] Shah D, Trivedi V, Sheth V, Shah A, Chauhan U. “ResTS: Residual deep interpretable architecture for plant disease detection”. Information Processing in Agriculture, 9(2), 212-223, 2022.
  • [17] Shoaib M, Shah B, Sayed N, Ali F, Ullah R, Hussain I. “Deep learning for plant bioinformatics: an explainable gradient-based approach for disease detection”. Frontiers in Plant Science, 14, 1-17, 2023.
  • [18] Thakur PS, Chaturvedi S, Khanna P, Sheorey T, Ojha A. “Vision transformer meets convolutional neural network for plant disease classification”. Ecological Informatics, 77, 102245, 2023.
  • [19] Chhetri TR, Hohenegger A, Fensel A, Kasali MA, Adekunle AA. “Towards improving prediction accuracy and user-level explainability using deep learning and knowledge graphs: A study on cassava disease”. Expert Systems with Applications, 233, 120955, 2023.
  • [20] Ghosh P, Mondal AK, Chatterjee S, Masud M, Meshref H, Bairagi AK. “Recognition of sunflower diseases using hybrid deep learning and its explainability with AI”. Mathematics, 11(10), 2241, 2023.
  • [21] Yang S, Xing Z, Wang H, Gao X, Dong X, Yao Y, Zhang R, Zhang X, Li S, Zhao Y, Liu Z. “Classification and localization of maize leaf spot disease based on weakly supervised learning”. Frontiers in Plant Science, 14, 1128399, 2023.
  • [22] Moupojou E, Tagne A, Retraint F, Tadonkemwa A, Wilfried D, Tapamo H, Nkenlifack M. “FieldPlant: A dataset of field plant images for plant disease detection and classification with deep learning”. IEEE Access, 11, 35398-35410, 2023.
  • [23] Karahan T, Nabiyev V. “Plant identification with convolutional neural networks and transfer learning”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, 27(5), 638-645, 2021.
  • [24] Tan M, Le QV. “EfficientNet: Rethinking model scaling for convolutional neural networks”. 2019 International Conference on Machine Learning, Long Beach, USA, 9-15 June 2019.
  • [25] Sandler M, Howard A, Zhu M, Zhmoginov A, Chen LC. “MobileNetV2: Inverted residuals and linear bottlenecks”. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, USA, 18-23 June 2018.
  • [26] He K, Zhang X, Ren S, Sun J. “Deep residual learning for image recognition”. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, USA, 27-30 June 2016.
  • [27] Zhang X, Zhou X, Lin M, Sun J. “ShuffleNet: An extremely efficient convolutional neural network for mobile devices”. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, USA, 18-23 June 2018.
  • [28] OpenMMLab. “MMPreTrain”. https://github.com/open-mmlab/mmpretrain (08.10.2024).
  • [29] Selvaraju RR, Cogswell M, Das A, Vedantam R, Parikh D, Batra D. “Grad-CAM: visual explanations from deep networks via gradient-based localization”. International Journal of Computer Vision, 128, 336-359, 2020.
  • [30] Chattopadhay A, Sarkar A, Howlader P, Balasubramanian VN. “Grad-CAM++: Generalized gradient-based visual explanations for deep convolutional networks”. 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Tahoe, USA, 12-15 March 2018.
  • [31] Wang H, Wang Z, Du M, Yang F, Zhang Z, Ding S, Mardziel P, Hu X. “Score-CAM: Score-weighted visual explanations for convolutional neural networks”. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 24-25, Seattle, USA, 14-19 June 2020.
  • [32] Muhammad MB, Yeasin M. “Eigen-CAM: Class activation map using principal components”. 2020 IEEE International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19-24 July 2020.

Details

Primary Language: English
Subjects: Image Processing, Pattern Recognition
Section: Research Article
Authors

Duygu Sinanç Terzi 0000-0002-3332-9414

Submission Date: 1 October 2024
Acceptance Date: 20 May 2025
Early View Date: 2 November 2025
Publication Date: 15 December 2025
Published Issue: Year 2025, Volume: 31, Issue: 7

How to Cite

APA Sinanç Terzi, D. (2025). Localization evaluation of CAM based explainability techniques for plant disease detection. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, 31(7), 1287-1298. https://doi.org/10.5505/pajes.2025.50955
AMA Sinanç Terzi D. Localization evaluation of CAM based explainability techniques for plant disease detection. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. December 2025;31(7):1287-1298. doi:10.5505/pajes.2025.50955
Chicago Sinanç Terzi, Duygu. “Localization evaluation of CAM based explainability techniques for plant disease detection”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 31, no. 7 (December 2025): 1287-98. https://doi.org/10.5505/pajes.2025.50955.
EndNote Sinanç Terzi D (01 December 2025) Localization evaluation of CAM based explainability techniques for plant disease detection. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 31 7 1287–1298.
IEEE D. Sinanç Terzi, “Localization evaluation of CAM based explainability techniques for plant disease detection”, Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, vol. 31, no. 7, pp. 1287–1298, 2025, doi: 10.5505/pajes.2025.50955.
ISNAD Sinanç Terzi, Duygu. “Localization evaluation of CAM based explainability techniques for plant disease detection”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 31/7 (December 2025), 1287-1298. https://doi.org/10.5505/pajes.2025.50955.
JAMA Sinanç Terzi D. Localization evaluation of CAM based explainability techniques for plant disease detection. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2025;31:1287–1298.
MLA Sinanç Terzi, Duygu. “Localization evaluation of CAM based explainability techniques for plant disease detection”. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, vol. 31, no. 7, 2025, pp. 1287-98, doi:10.5505/pajes.2025.50955.
Vancouver Sinanç Terzi D. Localization evaluation of CAM based explainability techniques for plant disease detection. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2025;31(7):1287-98.