Research Article

Enhanced Crop Row Detection Techniques for Autonomous Agricultural Vehicle: A Comparative Analysis of Classical and Deep Learning Approaches

Volume: 16 Number: 2 May 11, 2026

Abstract

As the global population grows, precision agriculture plays a key role in sustainable farming. This study compares three crop row detection methods for autonomous navigation in agriculture: classical image processing, CNN-based plant detection, and CNN-based crop row segmentation. Each method was evaluated in a controlled Gazebo simulation for accuracy, speed, mean angular error, and adaptability. The CNN-based crop row segmentation method (YOLOv11n-seg) achieved the highest accuracy (99.5%) and was least affected by environmental changes, but it was the slowest, averaging below 5 FPS. Classical image processing was the fastest (average 95.82 FPS) but the least reliable, as it was sensitive to camera angle and color changes. CNN-based plant detection, particularly YOLOv11n, offered a good balance of high accuracy (98.79%), real-time speed (32.3 FPS on a Jetson Orin Nano), and robustness, outperforming MobileNetV2 (94.62%, 21.49 FPS). The study also used mean angular error to quantify navigation stability: CNN-based methods, especially YOLOv11n, produced lower angular errors (±1.40°) and were more stable than classical methods (±9.82°), yielding more reliable simulated navigation. The findings highlight a trade-off between real-time performance and accuracy. Field trials are planned to validate these results under real-world conditions.
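The mean angular error reported above can be computed as the average absolute difference between predicted and reference heading angles, with differences wrapped to the (−180°, 180°] range. The following is a minimal sketch of that metric; the function name and the example headings are illustrative assumptions, not taken from the paper.

```python
def mean_angular_error(predicted_deg, reference_deg):
    """Mean absolute angular error in degrees between predicted crop row
    headings and ground-truth headings, wrapping each difference so that
    e.g. 359 deg vs 0 deg counts as a 1 deg error, not 359 deg."""
    assert len(predicted_deg) == len(reference_deg) and predicted_deg
    errors = []
    for p, r in zip(predicted_deg, reference_deg):
        # Wrap the signed difference into [-180, 180)
        diff = (p - r + 180.0) % 360.0 - 180.0
        errors.append(abs(diff))
    return sum(errors) / len(errors)

# Illustrative headings (hypothetical values):
pred = [1.0, -2.0, 0.5, 359.0]
ref = [0.0, 0.0, 0.0, 0.0]
print(mean_angular_error(pred, ref))  # → 1.125
```

Wrapping the difference before averaging matters for headings near the 0°/360° boundary, where a naive subtraction would grossly overstate the error.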

Keywords

Crop row detection, Autonomous navigation, Deep learning, Mean angular error

Supporting Institution

Ondokuz Mayıs University

Project Number

BAP08-2024-5409

Ethical Statement

This study does not involve human or animal subjects and therefore does not require ethical approval.

Thanks

This research was supported by the Ondokuz Mayıs University Scientific Research Projects Coordination Unit under project number BAP08-2024-5409. We thank the university for its financial support.

APA
Atchogou, A., & Tepe, C. (2026). Enhanced Crop Row Detection Techniques for Autonomous Agricultural Vehicle: A Comparative Analysis of Classical and Deep Learning Approaches. Karadeniz Fen Bilimleri Dergisi, 16(2), 605-625. https://doi.org/10.31466/kfbd.1714811