Research Article

Use of YOLOv5 Trained Model for Robotic Courgette Harvesting and Efficiency Analysis

Year 2024, Volume: 34 Issue: 4, 669 - 689
https://doi.org/10.29133/yyutbd.1517109

Abstract

The utilization of machine learning in vegetable harvesting not only enhances efficiency and precision but also addresses labor shortages and improves overall agricultural productivity. In this study, a machine learning method was developed for harvesting courgette fruit. Courgette is a fruit whose selection and harvesting in the field can be time-consuming. Four YOLOv5 models (nano, small, medium, and large) were used as the deep learning method, and all metric values of the models were analyzed. The most successful model was the one trained with the YOLOv5m algorithm using a batch size of 20 and 160 epochs on 640x640 images. Model performance was evaluated with the "metrics/precision", "metrics/recall", "metrics/mAP_0.5", and "metrics/mAP_0.5:0.95" scores. These metrics are key indicators of a model's recognition success and reflect its performance on the validation dataset. The metric values of the YOLOv5m model were higher than those of the other models, with the best configuration being an image size of 640x640, a batch size of 20, and 160 epochs. It was concluded that YOLOv5m is the best recognition model for separating the courgette from the branch in robotic courgette harvesting.
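
As a rough illustration of the training setup described above, the sketch below shows how a YOLOv5m detector could be trained with the reported settings (640x640 images, batch size 20, 160 epochs) using the ultralytics/yolov5 tooling and then loaded for courgette detection. This is a minimal sketch, not the authors' exact pipeline; the dataset file courgette.yaml, the run name, and the test image path are illustrative assumptions.

    # Illustrative sketch only. Training would be run from a clone of
    # https://github.com/ultralytics/yolov5 with the hyperparameters reported
    # in the abstract (all names below are hypothetical):
    #
    #   python train.py --img 640 --batch 20 --epochs 160 \
    #       --data courgette.yaml --weights yolov5m.pt --name courgette_yolov5m
    #
    # Validation writes metrics/precision, metrics/recall, metrics/mAP_0.5 and
    # metrics/mAP_0.5:0.95 to runs/train/courgette_yolov5m/results.csv.

    import torch

    # Load the trained custom weights through the YOLOv5 torch.hub interface.
    model = torch.hub.load(
        "ultralytics/yolov5", "custom",
        path="runs/train/courgette_yolov5m/weights/best.pt",
    )
    model.conf = 0.5  # confidence threshold for keeping detections

    # Run detection on a field image (path is hypothetical) and inspect the boxes,
    # which a harvesting robot could use to locate fruit before cutting.
    results = model("courgette_field.jpg")
    detections = results.pandas().xyxy[0]  # xmin, ymin, xmax, ymax, confidence, class, name
    print(detections)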

References

  • Alam, M. S., Alam, M., Tufail, M., Khan, M. U., Güneş, A., Salah, B., Nasir, F. E., Saleem, W., & Khan, M. T. (2022). TobSet: A new tobacco crop and weeds image dataset and its utilization for vision-based spraying by agricultural robots. Applied Sciences, 12(3), 1308. https://doi.org/10.3390/app12031308
  • Arad, B., Balendonck, J., Barth, R., Ben‐Shahar, O., Edan, Y., Hellström, T., Hemming, J., Kurtser, P., Ringdahl, O., & Tielen, T. (2020). Development of a sweet pepper harvesting robot. Journal of Field Robotics, 37(6), 1027-1039. https://doi.org/10.1002/rob.21937
  • Atalay, M., & Çelik, E. (2017). Artificial intelligence and machine learning applications in big data analysis. Mehmet Akif Ersoy University Social Sciences Institute Journal, 9(22), 155-172. https://doi.org/10.20875/makusobed.309727
  • Altınbilek, H. F., & Kızıl, Ü. (2022). Identification of Paddy Rice Diseases Using Deep Convolutional Neural Networks. Yuzuncu Yıl University Journal of Agricultural Sciences, 32(4), 705-713. https://doi.org/10.29133/yyutbd.1140911
  • Bai, W., Zhao, J., Dai, C., Zhang, H., Zhao, L., Ji, Z., & Ganchev, I. (2023). Two novel models for traffic sign detection based on YOLOv5s. Axioms, 12(2), 160. https://doi.org/10.3390/axioms12020160
  • Barman, U., Das, D., Sonowal, G., & Dutta, M. (2024). Innovative Approaches to Rice (Oryza sativa) Crop Health: A Comprehensive Analysis of Deep Transfer Learning for Early Disease Detection. Yuzuncu Yıl University Journal of Agricultural Sciences, 34(2), 314-322. https://doi.org/10.29133/yyutbd.1402821
  • Bati, C. T., & Ser, G. (2023). Effects of Data Augmentation Methods on YOLO v5s: Application of Deep Learning with Pytorch for Individual Cattle Identification. Yuzuncu Yıl University Journal of Agricultural Sciences, 33(3), 363-376. https://doi.org/10.29133/yyutbd.1246901
  • Chen, W., Lu, S., Liu, B., Chen, M., Li, G., & Qian, T. (2022). CitrusYOLO: A algorithm for citrus detection under orchard environment based on YOLOv4. Multimedia Tools and Applications, 81, 31363–31389. https://doi.org/10.1007/s11042-022-12687-5
  • Darwin, B., Dharmaraj, P., Prince, S., Popescu, D. E., & Hemanth, D. J. (2021). Recognition of bloom/yield in crop images using deep learning models for smart agriculture: A review. Agronomy, 11(4), 646. https://doi.org/10.3390/agronomy11040646
  • Deng, L., & Yu, D. (2014). Deep learning: methods and applications. Foundations and Trends® in Signal Processing, 7(3–4), 197-387. https://doi.org/10.1561/2000000039
  • Droukas, L., Doulgeri, Z., Tsakiridis, N. L., Triantafyllou, D., Kleitsiotis, I., Mariolis, I., Giakoumis, D., Tzovaras, D., Kateris, D., & Bochtis, D. (2023). A Survey of Robotic Harvesting Systems and Enabling Technologies. Journal of Intelligent & Robotic Systems, 107(2), 21. https://doi.org/10.1007/s10846-022-01793-z
  • Du, F. J., & Jiao, S. J. (2022). Improvement of lightweight convolutional neural network model based on YOLO algorithm and its research in pavement defect detection. Sensors, 22(9), 3537. https://doi.org/10.3390/s22093537
  • Elavarasan, D., & Vincent, P. D. (2020). Crop yield prediction using deep reinforcement learning model for sustainable agrarian applications. IEEE Access, 8, 86886-86901. https://doi.org/10.1109/access.2020.2992480
  • Fountas, S., Mylonas, N., Malounas, I., Rodias, E., Hellmann Santos, C., & Pekkeriet, E. (2020). Agricultural robotics for field operations. Sensors, 20(9), 2672. https://doi.org/10.3390/s20092672
  • Gholipoor, M., & Fathollah, N. (2019). Fruit yield prediction of pepper using artificial neural network. Scientia Horticulturae, 250, 249-253. https://doi.org/10.1016/j.scienta.2019.02.040
  • Hong, W., Ma, Z., Ye, B., Yu, G., Tang, T., & Zheng, M. (2023). Detection of green asparagus in complex environments based on the improved YOLOv5 algorithm. Sensors, 23(3), 1562. https://doi.org/10.3390/s23031562
  • İmak, A., Doğan, G., Şengür, A., & Ergen, B. (2023). A new method based on extracting, combining and selecting deep features from natural and synthetic data for classification of grapevine leaf species. International Journal of Pure and Applied Sciences, 9(1), 46-55. https://doi.org/10.29132/ijpas.1144615
  • Jaramillo-Hernández, J. F., Julian, V., Marco-Detchart, C., & Rincón, J. A. (2024). Application of machine vision techniques in low-cost devices to improve efficiency in precision farming. Sensors, 24(3), 937. https://doi.org/10.3390/s24030937
  • Kaldarova, M., Akanova, A., Nazyrova, A., Mukanova, A., & Tynykulova, A. (2023). Identification of weeds in fields based on computer vision technology. Eastern-European Journal of Enterprise Technologies, 124(2). https://doi.org/10.15587/1729-4061.2023.284600
  • Karahanlı, G., & Taşkın, C. (2024). Determining the growth stages of sunflower plants using deep learning methods. Journal of the Faculty of Engineering and Architecture of Gazi University, 39(3), 1455-1472. https://doi.org/10.17341/gazimmfd.1200615
  • Kini, A. S., Reddy, P. K., & Pai, S. N. (2023). Techniques of deep learning and image processing in plant leaf disease detection: A review. International Journal of Electrical and Computer Engineering (IJECE), 13(3), 3029-3040. https://doi.org/10.11591/ijece.v13i3.pp3029-3040
  • Lu, D., Ye, J., Wang, Y., & Yu, Z. (2023). Plant detection and counting: Enhancing precision agriculture in UAV and general scenes. IEEE Access. https://doi.org/10.1109/ACCESS.2023.3325747
  • Luo, J., Li, B., & Leung, C. (2023). A survey of computer vision technologies in urban and controlled-environment agriculture. ACM Computing Surveys, 56(5), 1-39. https://doi.org/10.1145/1122445.1122456
  • Mao, S., Li, Y., Ma, Y., Zhang, B., Zhou, J., & Wang, K. (2020). Automatic cucumber recognition algorithm for harvesting robots in the natural environment using deep learning and multi-feature fusion. Computers and Electronics in Agriculture, 170, 105254. https://doi.org/10.1016/j.compag.2020.105254
  • Nath, S. (2024). A vision of precision agriculture: Balance between agricultural sustainability and environmental stewardship. Agronomy Journal, 116(3), 1126-1143. https://doi.org/10.1002/agj2.21405
  • Palacios, F., Diago, M. P., Melo-Pinto, P., & Tardaguila, J. (2023). Early yield prediction in different grapevine varieties using computer vision and machine learning. Precision Agriculture, 24(2), 407-435. https://doi.org/10.1007/s11119-022-09950-y
  • Punithavathi, R., Rani, A. D. C., Sughashini, K., Kurangi, C., Nirmala, M., Ahmed, H. F. T., & Balamurugan, S. (2023). Computer Vision and Deep Learning-enabled Weed Detection Model for Precision Agriculture. Computer Systems Science and Engineering, 44(3), 2759-2774. https://doi.org/10.32604/csse.2023.027647
  • Rai, N., Mahecha, M. V., Christensen, A., Quanbeck, J., Zhang, Y., Howatt, K., Ostlie, M., & Sun, X. (2023). Multi-format open-source weed image dataset for real-time weed identification in precision agriculture. Data in Brief, 51, 109691. https://doi.org/10.1016/j.dib.2023.109691
  • Redmon, J., & Farhadi, A. (2017). YOLO9000: better, faster, stronger. Proceedings of the IEEE conference on computer vision and pattern recognition, Honolulu, HI, USA.
  • Redmon, J., & Farhadi, A. (2018). YOLOv3: An incremental improvement. arXiv preprint arXiv:1804.02767. https://doi.org/10.48550/arXiv.1804.02767
  • Rivera Zarate, G. (2023). LiDAR applications in precision agriculture for cultivating crops: A review of recent advances. Computers and Electronics in Agriculture, 207, 107737. https://doi.org/10.1016/j.compag.2023.107737
  • Roshanianfard, A., Noguchi, N., Ardabili, S., Mako, C., & Mosavi, A. (2022). Autonomous robotic system for pumpkin harvesting. Agronomy, 12(7), 1594. https://doi.org/10.3390/agronomy12071594
  • Rudenko, M., Plugatar, Y., Korzin, V., Kazak, A., Gallini, N., & Gorbunova, N. (2023). The use of computer vision to improve the affinity of rootstock-graft combinations and identify diseases of grape seedlings. Inventions, 8(4), 92. https://doi.org/10.3390/inventions8040092
  • Sapkota, R., Stenger, J., Ostlie, M., & Flores, P. (2023). Towards reducing chemical usage for weed control in agriculture using UAS imagery analysis and computer vision techniques. Scientific Reports, 13(1), 6548. https://doi.org/10.1038/s41598-023-33042-0
  • Shin, Y. H., Park, M. J., Lee, O. Y., & Kim, J. O. (2020). Deep orthogonal transform feature for image denoising. IEEE Access, 8, 66898-66909. https://doi.org/10.1109/ACCESS.2020.2986827
  • Soeb, M. J. A., Jubayer, M. F., Tarin, T. A., Al Mamun, M. R., Ruhad, F. M., Parven, A., Mubarak, N. M., Karri, S. L., & Meftaul, I. M. (2023). Tea leaf disease detection and identification based on YOLOv7 (YOLO-T). Scientific Reports, 13(1), 6078. https://doi.org/10.1038/s41598-023-33270-4
  • Štaka, Z., & Mišić, M. (2023). Leaf counting in the presence of occlusion in Arabidopsis thaliana plant using convolutional neural networks. Journal of Electronic Imaging, 32(5), 052407. https://doi.org/10.1117/1.jei.32.5.052407
  • Ubaid, M. T., & Javaid, S. (2024). Precision agriculture: Computer vision-enabled sugarcane plant counting in the tillering phase. Journal of Imaging, 10(5), 102. https://doi.org/10.3390/jimaging10050102
  • Wang, H., Ji, C., Gu, B., & Tian, G. (2013). Cucumber image segmentation based on weighted connection coefficient pulse coupled neural network. Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 44(3), 204-208. https://doi.org/10.6041/j.issn.1000-1298.2013.03.037
  • Wang, Y., Wang, Y., & Zhao, J. (2022). MGA-YOLO: A lightweight one-stage network for apple leaf disease detection. Frontiers in Plant Science, 13, 927424. https://doi.org/10.3389/fpls.2022.927424
  • Xiao, F., Wang, H., Li, Y., Cao, Y., Lv, X., & Xu, G. (2023). Object detection and recognition techniques based on digital image processing and traditional machine learning for fruit and vegetable harvesting robots: an overview and review. Agronomy, 13(3), 639. https://doi.org/10.3390/agronomy13030639
  • Xu, J., & Lu, Y. (2023). OpenWeedGUI: An open-source graphical user interface for weed imaging and detection. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII. https://doi.org/10.1117/12.2664131
  • Zhu, L., Li, Z., Li, C., Wu, J., & Yue, J. (2018). High performance vegetable classification from images based on AlexNet deep learning model. International Journal of Agricultural and Biological Engineering, 11(4), 217-223. https://doi.org/10.25165/j.ijabe.20181104.2690
  • Zualkernan, I., Abuhani, D. A., Hussain, M. H., Khan, J., & El-Mohandes, M. (2023). Machine learning for precision agriculture using imagery from unmanned aerial vehicles (UAVs): A survey. Drones, 7(6), 382. https://doi.org/10.3390/drones7060382

Details

Primary Language English
Subjects Agricultural Automatization
Journal Section Articles
Authors

Erhan Kahya (ORCID: 0000-0001-7768-9190)

Early Pub Date December 15, 2024
Publication Date
Submission Date July 16, 2024
Acceptance Date October 22, 2024
Published in Issue Year 2024 Volume: 34 Issue: 4

Cite

APA Kahya, E. (2024). Use of YOLOv5 Trained Model for Robotic Courgette Harvesting and Efficiency Analysis. Yuzuncu Yıl University Journal of Agricultural Sciences, 34(4), 669-689. https://doi.org/10.29133/yyutbd.1517109
Yuzuncu Yil University Journal of Agricultural Sciences by Van Yuzuncu Yil University Faculty of Agriculture is licensed under a Creative Commons Attribution 4.0 International License.