Research Article

Identifying Agricultural Crops with Similar Spectral Properties Using Machine Learning Classifiers and SenseFly eBee SQ Multispectral UAV Images

Year 2026, Volume: 32 Issue: 1, 93 - 111, 20.01.2026
https://doi.org/10.15832/ankutbd.1639091

Abstract

This study investigates the use of vegetation indices for accurately identifying crops with similar spectral characteristics, such as grapes, apricots, tomatoes, wheat, and clover, to enhance crop monitoring and management. A 59-hectare area in Karakaya Village, located in the Üzümlü district of Erzincan Province, Türkiye, was selected as the study area. This area contains crops with varying textures, object heights, and spectral characteristics. Multispectral (MS) images were acquired using the SenseFly eBee SQ unmanned aerial vehicle (UAV) and subsequently processed to generate an orthophoto, a digital terrain model (DTM), and a digital surface model (DSM). Fifteen vegetation indices, Gabor texture features, and object heights were integrated with the MS bands. Crop classification was performed using two high-accuracy machine learning algorithms: Random Forest (RF) and Support Vector Machine (SVM). According to the overall classification accuracy results, the use of vegetation indices improved classification accuracy by 9% for RF and 5% for SVM. Incorporating Gabor texture features with the top-performing indices (MACARI1, OSAVI, ADVI, and DVI) further increased accuracy by 20% for RF and 12% for SVM. Additionally, including object height alongside the indices and Gabor features yielded further accuracy gains of 10% and 11% for RF and SVM, respectively. F1-score, specificity, and accuracy analyses, along with various kappa statistics, also confirmed the significant improvements in classification performance.
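The feature-integration step described above can be illustrated with a minimal sketch: derive index layers from the multispectral bands and stack them with the original bands into a pixel-by-feature matrix suitable for an RF or SVM classifier. This is not the study's code; the band values are synthetic, and only two of the named indices (DVI and OSAVI, whose standard formulas are well known) are shown.

```python
import numpy as np

# Illustrative sketch, not the study's implementation.
# Synthetic reflectance bands for a tiny 4x4 scene.
rng = np.random.default_rng(0)
h, w = 4, 4
green, red, red_edge, nir = rng.uniform(0.05, 0.6, size=(4, h, w))

dvi = nir - red                           # Difference Vegetation Index
osavi = (nir - red) / (nir + red + 0.16)  # Optimized Soil-Adjusted VI

# Stack bands + index layers, then flatten to (pixels, features)
# as input for a pixel-based RF/SVM classifier.
features = np.stack([green, red, red_edge, nir, dvi, osavi], axis=-1)
X = features.reshape(-1, features.shape[-1])
print(X.shape)  # (16, 6)
```

In the study, Gabor texture layers and the object-height layer (DSM minus DTM) would be appended as further columns in the same way before training.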
According to the McNemar test, the χ² values comparing the orthophoto-only classifications with those incorporating indices, texture, and object height ranged from 6.353 to 35.556 for RF and from 7.220 to 11.021 for SVM. Since all χ² values exceeded the critical value of 3.84, the results indicate statistically significant improvements in classification accuracy at the 95% confidence level.
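The McNemar comparison above can be sketched as follows. This assumes the standard uncorrected form of the statistic, χ² = (b − c)² / (b + c), where b counts test samples only the first classifier labels correctly and c counts those only the second labels correctly; the toy labels below are illustrative, not data from the study.

```python
# Minimal McNemar sketch (uncorrected statistic); toy data, not the study's.
def mcnemar_chi2(y_true, pred_a, pred_b):
    # b: A correct, B wrong; c: A wrong, B correct
    b = sum(1 for t, a, p in zip(y_true, pred_a, pred_b) if a == t and p != t)
    c = sum(1 for t, a, p in zip(y_true, pred_a, pred_b) if a != t and p == t)
    if b + c == 0:
        return 0.0
    return (b - c) ** 2 / (b + c)

# Toy case: B fixes 12 of A's errors, while A is right on 2 samples B misses.
y_true = [1] * 100
pred_a = [1] * 86 + [0] * 12 + [1] * 2
pred_b = [1] * 86 + [1] * 12 + [0] * 2
chi2 = mcnemar_chi2(y_true, pred_a, pred_b)
print(round(chi2, 3))  # (2 - 12)^2 / 14 = 7.143
# chi2 > 3.84, so the difference is significant at the 95% level
```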


Details

Primary Language English
Subjects Geomatic Engineering (Other)
Journal Section Research Article
Authors

Özlem Akar 0000-0001-6381-4907

Alper Akar 0000-0003-4284-5928

Halim Ferit Bayata 0000-0001-8274-8888

Submission Date February 13, 2025
Acceptance Date July 22, 2025
Publication Date January 20, 2026
Published in Issue Year 2026 Volume: 32 Issue: 1

Cite

APA Akar, Ö., Akar, A., & Bayata, H. F. (2026). Identifying Agricultural Crops with Similar Spectral Properties Using Machine Learning Classifiers and SenseFly eBee SQ Multispectral UAV Images. Journal of Agricultural Sciences, 32(1), 93-111. https://doi.org/10.15832/ankutbd.1639091

Journal of Agricultural Sciences is published as open access journal. All articles are published under the terms of the Creative Commons Attribution License (CC BY).