Research Article

Instance Segmentation of Sunflower Inflorescences from RGB and Multispectral UAV Orthomosaics with Mask R-CNN Deep Neural Network

Year 2025, Volume: 6, Issue: 1, 57–72, 26.03.2025

Abstract

Sunflower is an important oilseed source for Türkiye, distinguished from other agricultural crops by its large and distinctive inflorescence. Accurate detection of sunflower inflorescences is crucial for yield estimation and sustainable production planning. The main goal of this study is to evaluate the performance of the deep learning-based Mask R-CNN model in detecting sunflower inflorescences from RGB and multispectral UAV orthomosaics. For this purpose, UAV aerial photographs were captured over the study area in the Arifiye district of Sakarya, and high-resolution orthomosaics were produced. A Mask R-CNN model with a ResNet-50 backbone was trained and tested on manually labeled datasets derived from the RGB and multispectral orthomosaics. The results indicate that the model achieved higher accuracy with multispectral data, reaching an F1 score of 96.73% and an average precision of 94.74% at an intersection-over-union (IoU) threshold of 0.50, where IoU measures the overlap between predicted and ground-truth detections. With RGB data, the model performed best at a 0.75 IoU threshold, obtaining an F1 score of 78.19% and an average precision of 63.50%. These findings suggest that multispectral UAV data are more effective for detecting sunflower inflorescences, while RGB UAV data provide better localization accuracy.
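The metrics quoted above (F1 score and average precision at IoU thresholds of 0.50 and 0.75) rest on intersection-over-union matching between predicted and ground-truth detections. The sketch below illustrates how such threshold-based metrics can be computed for axis-aligned bounding boxes; the function names and the greedy matching strategy are illustrative assumptions, not the paper's actual evaluation code, and instance segmentation would use mask IoU rather than box IoU:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def f1_at_threshold(predictions, ground_truths, threshold=0.50):
    """Precision, recall, and F1 after greedy one-to-one matching at an IoU threshold."""
    unmatched = list(ground_truths)  # ground truths not yet claimed by a prediction
    tp = 0
    for pred in predictions:
        # Match each prediction to the remaining ground truth with the highest IoU.
        best = max(unmatched, key=lambda gt: iou(pred, gt), default=None)
        if best is not None and iou(pred, best) >= threshold:
            tp += 1
            unmatched.remove(best)
    fp = len(predictions) - tp      # predictions with no sufficiently overlapping match
    fn = len(ground_truths) - tp    # ground truths left undetected
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

For example, a prediction overlapping a ground-truth box at IoU ≈ 0.68 counts as a true positive at the 0.50 threshold but not at 0.75, which is why the two thresholds reward detection ability and localization tightness differently.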

References

  • Albahar, M. (2023). A survey on deep learning and its impact on agriculture: Challenges and opportunities. Agriculture, 13(3), Article 540. https://doi.org/10.3390/agriculture13030540
  • Badrinarayanan, V., Kendall, A., & Cipolla, R. (2017). SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12), 2481–2495.
  • Bakhshipour, A., Sanaeifar, A., Payman, S. H., & de la Guardia, M. (2018). Evaluation of data mining strategies for classification of black tea based on image-based features. Food Analytical Methods, 11, 1041–1050.
  • Bouguettaya, A., Zarzour, H., Kechida, A., & Taberkit, A. M. (2022). Deep learning techniques to classify agricultural crops through UAV imagery: A review. Neural Computing and Applications, 34(12), 9511–9536.
  • Centorame, L., Gasperini, T., Ilari, A., Del Gatto, A., & Foppa Pedretti, E. (2024b). An overview of machine learning applications on plant phenotyping, with a focus on sunflower. Agronomy, 14(4), Article 719. https://doi.org/10.3390/agronomy14040719
  • Centorame, L., Ilari, A., Del Gatto, A., & Pedretti, E. F. (2024a). A systematic review on precision agriculture applied to sunflowers: The role of hyperspectral imaging. Computers and Electronics in Agriculture, 222, Article 109097. https://doi.org/10.1016/j.compag.2024.109097
  • Ding, R., Luo, J., Wang, C., Yu, L., Yang, J., Wang, M., Zhong, S., & Gu, R. (2023). Identifying and mapping individual medicinal plant Lamiophlomis rotata at high elevations by using unmanned aerial vehicles and deep learning. Plant Methods, 19, Article 38. https://doi.org/10.1186/s13007-023-01015-z
  • Food and Agriculture Organization. (2022). Crops and livestock products. https://www.fao.org/faostat/en/#data/QCL
  • Fromm, M., Schubert, M., Castilla, G., Linke, J., & McDermid, G. (2019). Automated detection of conifer seedlings in drone imagery using convolutional neural networks. Remote Sensing, 11(21), Article 2585. https://doi.org/10.3390/rs11212585
  • Gao, X., Zan, X., Yang, S., Zhang, R., Chen, S., Zhang, X., Liu, Z., Ma, Y., Zhao, Y., & Li, S. (2023). Maize seedling information extraction from UAV images based on semi-automatic sample generation and Mask R-CNN model. European Journal of Agronomy, 147, Article 126845. https://doi.org/10.1016/j.eja.2023.126845
  • Gao, Z., Luo, Z., Zhang, W., Lv, Z., & Xu, Y. (2020). Deep learning application in plant stress imaging: A review. AgriEngineering, 2(3), Article 29. https://doi.org/10.3390/agriengineering2030029
  • Gibril, M. B. A., Shafri, H. Z. M., Shanableh, A., Al-Ruzouq, R., bin Hashim, S. J., Wayayok, A., & Sachit, M. S. (2024). Large-scale assessment of date palm plantations based on UAV remote sensing and multiscale vision transformer. Remote Sensing Applications: Society and Environment, 34, Article 101195. https://doi.org/10.1016/j.rsase.2024.101195
  • Guo, A., Huang, W., Dong, Y., Ye, H., Ma, H., Liu, B., Wu, W., Ren, Y., Ruan, C., & Geng, Y. (2021). Wheat yellow rust detection using UAV-based hyperspectral technology. Remote Sensing, 13(1), Article 123. https://doi.org/10.3390/rs13010123
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016, June 27–30). Deep residual learning for image recognition [Conference presentation]. IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.
  • He, K., Gkioxari, G., Dollár, P., & Girshick, R. (2017, October 22–29). Mask R-CNN [Conference presentation]. IEEE International Conference on Computer Vision, Venice, Italy.
  • Huang, Y., Reddy, K. N., Fletcher, R. S., & Pennington, D. (2018). UAV low-altitude remote sensing for precision weed management. Weed Technology, 32(1), 2–6. https://doi.org/10.1017/wet.2017.89
  • International Society of Precision Agriculture. (2024). Precision agriculture definition. https://www.ispag.org/about/definition
  • Jing, R., Niu, Q., Tian, Y., Zhang, H., Zhao, Q., Li, Z., Zhou, X., & Li, D. (2024). Sunflower-YOLO: Detection of sunflower capitula in UAV remote sensing images. European Journal of Agronomy, 160, Article 127332. https://doi.org/10.1016/j.eja.2024.127332
  • Jurišić, M., Radočaj, D., Plaščak, I., Galić Subašić, D., & Petrović, D. (2022). The evaluation of the RGB and multispectral camera on the unmanned aerial vehicle (UAV) for the machine learning classification of maize. Poljoprivreda, 28(2), 74–80. https://doi.org/10.18047/poljo.28.2.10
  • Kamilaris, A., & Prenafeta-Boldú, F. X. (2018). Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147, 70–90. https://doi.org/10.1016/j.compag.2018.02.016
  • Karami, A., Quijano, K., & Crawford, M. (2021). Advancing tassel detection and counting: Annotation and algorithms. Remote Sensing, 13(15), Article 2881. https://doi.org/10.3390/rs13152881
  • Kerkech, M., Hafiane, A., & Canals, R. (2018). Deep learning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Computers and Electronics in Agriculture, 155, 237–243.
  • Li, J., Li, Y., Qiao, J., Li, L., Wang, X., Yao, J., & Liao, G. (2023). Automatic counting of rapeseed inflorescences using deep learning method and UAV RGB imagery. Frontiers in Plant Science, 14, Article 1101143. https://doi.org/10.3389/fpls.2023.1101143
  • Maxwell, A. E., Warner, T. A., & Guillén, L. A. (2021). Accuracy assessment in convolutional neural network-based deep learning remote sensing studies—Part 1: Literature review. Remote Sensing, 13(13), Article 2450. https://doi.org/10.3390/rs13132450
  • Padilla, R., Passos, W. L., Dias, T. L., Netto, S. L., & Da Silva, E. A. (2021). A comparative analysis of object detection metrics with a companion open-source toolkit. Electronics, 10(3), Article 279. https://doi.org/10.3390/electronics10030279
  • Poleshchenko, D., Mikhailov, I., & Petrov, V. (2023, March 29–31). On the segmentation of sunflower plants in UAV photos [Conference presentation]. 2023 25th International Conference on Digital Signal Processing and its Applications (DSPA 2023), Moscow, Russian Federation.
  • Popescu, D., Stoican, F., Stamatescu, G., Ichim, L., & Dragana, C. (2020). Advanced UAV–WSN system for intelligent monitoring in precision agriculture. Sensors, 20(3), Article 817. https://doi.org/10.3390/s20030817
  • Qureshi, S., Ameer, A., Zia, A., Latif, A., & Karrila, S. (2023). Comparing the effectiveness of classic Mask RCNN and vision transformer in early weed detection. SSRN. http://dx.doi.org/10.2139/ssrn.4436132
  • Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016, June 27–30). You only look once: Unified, real-time object detection [Conference presentation]. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.
  • Sanaeifar, A., Guindo, M. L., Bakhshipour, A., Fazayeli, H., Li, X., & Yang, C. (2023). Advancing precision agriculture: The potential of deep learning for cereal plant head detection. Computers and Electronics in Agriculture, 209, Article 107875. https://doi.org/10.1016/j.compag.2023.107875
  • Schneiter, A. A., & Miller, J. F. (1981). Description of sunflower growth stages. Crop Science, 21(6), 901–903.
  • Sefercik, U. G., Nazar, M., Aydin, I., Büyüksalih, G., Gazioglu, C., & Bayirhan, I. (2024). Comparative analyses for determining shallow water bathymetry potential of multispectral UAVs: Case study in Tavşan Island, Sea of Marmara. Frontiers in Marine Science, 11, Article 1388704. https://doi.org/10.3389/fmars.2024.1388704
  • Seiler, G. J. (1997). Anatomy and morphology of sunflower. In A. A. Schneiter (Ed.), Sunflower technology and production (pp. 67–111). American Society of Agronomy, Crop Science Society of America, Soil Science Society of America.
  • Singh, P., Pandey, P. C., Petropoulos, G. P., Pavlides, A., Srivastava, P. K., Koutsias, N., Deng, K. A. K., & Bao, Y. (2020). Hyperspectral remote sensing in precision agriculture: Present status, challenges, and future trends. In P. C. Pandey, P. K. Srivastava, H. Balzter, B. Bhattacharya, & G. P. Petropoulos (Eds.), Hyperspectral remote sensing (pp. 121–146). Elsevier.
  • Smith, L. N. (2017, March 24–31). Cyclical learning rates for training neural networks [Conference presentation]. 2017 IEEE Winter Conference on Applications of Computer Vision (WACV), Santa Rosa, CA, USA.
  • Song, Z., Wang, P., Zhang, Z., Yang, S., & Ning, J. (2023). Recognition of sunflower growth period based on deep learning from UAV remote sensing images. Precision Agriculture, 24(4), 1417–1438.
  • T.C. Çevre, Şehircilik ve İklim Değişikliği Bakanlığı. (2024). Sakarya İli 2023 Yılı Çevre Durum Raporu. https://webdosya.csb.gov.tr/db/ced/icerikler/sakarya-ilcdr-2023-1-20240614095821.pdf
  • T.C. Tarım ve Orman Bakanlığı. (2007). Ayçiçeği tarımı. https://arastirma.tarimorman.gov.tr/etae/Belgeler/EgitimBrosur/2482012115736884.pdf
  • Tarımsal Ekonomi ve Politika Geliştirme Enstitüsü Müdürlüğü. (2023). 2023 durum tahmin raporları. https://arastirma.tarimorman.gov.tr/tepge/Belgeler/PDF%20Durum-Tahmin%20Raporları/2023%20Durum-Tahmin%20Raporları/Ayçiçeği%20Durum-Tahmin%20Raporu%202023-392%20TEPGE.pdf
  • Türkiye İstatistik Kurumu. (2023). Merkezi dağıtım sistemi. https://biruni.tuik.gov.tr/medas/?locale=tr
  • Xiao, J., Aggarwal, A. K., Rage, U. K., Katiyar, V., & Avtar, R. (2023). Deep learning-based spatiotemporal fusion of unmanned aerial vehicle and satellite reflectance images for crop monitoring. IEEE Access, 11, 85600–85614.
  • Yang, Q., Shi, L., Han, J., Zha, Y., & Zhu, P. (2019). Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Research, 235, 142–153.
  • Yang, X., Ye, Y., Li, X., Lau, R. Y., Zhang, X., & Huang, X. (2018). Hyperspectral image classification with deep learning models. IEEE Transactions on Geoscience and Remote Sensing, 56(9), 5408–5423.
  • Yildirim, E., & Kavzoglu, T. (2025). Deep convolutional neural networks for ship detection using refined DOTA and TGRS-HRRSD high-resolution image datasets. Advances in Space Research, 75(2), 1871–1887. https://doi.org/10.1016/j.asr.2024.10.028
  • Yildirim, E., Colkesen, I., & Sefercik, U. G. (2024a, July 9–10). Identification of sunflowers (Helianthus annuus L.) from multi-temporal UAV orthomosaics using deep learning models [Conference presentation]. 9th Advanced Engineering Days, Tabriz, Iran.
  • Yildirim, E., Sefercik, U. G., & Kavzoglu, T. (2024b). Automated identification of vehicles in very high-resolution UAV orthomosaics using YOLOv7 deep learning model. Turkish Journal of Electrical Engineering and Computer Sciences, 32(1), 144–165. https://doi.org/10.55730/1300-0632.4060
  • Zheng, H., Zhou, X., He, J., Yao, X., Cheng, T., Zhu, Y., Cao, W., & Tian, Y. (2020). Early season detection of rice plants using RGB, NIR-G-B and multispectral images from unmanned aerial vehicle (UAV). Computers and Electronics in Agriculture, 169, Article 105223. https://doi.org/10.1016/j.compag.2020.105223

Ayçiçeği Çiçek Tablalarının Mask R-CNN Derin Sinir Ağı ile RGB ve Multispektral İHA Ortomozaiklerinden Örnek Segmentasyonu

There are 47 references in total.

Details

Primary Language: Turkish
Subjects: Deep Learning, Photogrammetry and Remote Sensing
Section: Research Articles
Authors

Esra Yıldırım 0000-0002-4951-0488

İsmail Çölkesen 0000-0001-9670-3023

Umut Güneş Sefercik 0000-0003-2403-5956

Early View Date: March 25, 2025
Publication Date: March 26, 2025
Submission Date: December 16, 2024
Acceptance Date: February 13, 2025
Published in Issue: Year 2025, Volume: 6, Issue: 1

How to Cite

APA: Yıldırım, E., Çölkesen, İ., & Sefercik, U. G. (2025). Ayçiçeği Çiçek Tablalarının Mask R-CNN Derin Sinir Ağı ile RGB ve Multispektral İHA Ortomozaiklerinden Örnek Segmentasyonu. Türk Uzaktan Algılama ve CBS Dergisi, 6(1), 57–72. https://doi.org/10.48123/rsgis.1602369

Creative Commons License
Turkish Journal of Remote Sensing and GIS (Türk Uzaktan Algılama ve CBS Dergisi) is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.