Research Article

Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot

Year 2022, Volume 10, Issue 3, 1589-1602, 31.07.2022
https://doi.org/10.29130/dubited.1075572

Abstract

Vision-based strawberry picking and placing is one of the main objectives of strawberry harvesting robots, which must complete their visual servoing procedures accurately. Occlusion is the main challenge in strawberry ripeness detection for agricultural robots. In this study, strawberry ripeness detection using a camouflage-based data augmentation strategy is proposed to simulate the natural occlusion conditions of the strawberry harvesting environment. YOLOv4, YOLOv4-tiny, and Scaled-YOLOv4, together with their traditional-augmentation and camouflage-based-augmentation derivatives, were used to determine the effect of the camouflage-based augmentation technique in overcoming the occlusion issue. The results were evaluated mainly in terms of mean Intersection over Union (IoU), F1 score, average precision (AP) for ripe and unripe strawberries, and frames per second (fps). YOLOv4-tiny with the camouflage-based data augmentation technique demonstrated superior performance, detecting ripe and unripe strawberries with 84% mean IoU and ~99% AP for both classes at an average of 206 fps, satisfying the operational needs of an agricultural strawberry harvesting robot. The performance of the proposed technique was then successfully tested on a dataset, termed the challenge dataset in this study, to demonstrate its performance in a complex and occluded strawberry harvesting environment. The camouflage-based data augmentation technique thus helps to improve the detection of ripe and unripe strawberries toward an autonomous strawberry harvesting robot.
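The evaluation metrics named above are standard in object detection. As a minimal sketch of how two of them are computed (the function names and sample numbers below are illustrative, not drawn from the paper's data), a prediction counts as a true positive when its IoU with a ground-truth box reaches a chosen threshold, and the F1 score combines the resulting precision and recall:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)  # overlap area, 0 if disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def f1_score(tp, fp, fn):
    """F1 from detection counts; a detection is a TP when IoU >= threshold."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Hypothetical ripe-strawberry prediction vs. its ground-truth box:
print(iou((10, 10, 60, 60), (20, 20, 70, 70)))  # ~0.47, below a 0.5 TP threshold
print(f1_score(tp=95, fp=5, fn=7))              # ~0.94
```

AP for each class is then the area under the precision-recall curve traced out as the detector's confidence threshold is swept.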

References

  • [1] N. Zhu et al., “Deep learning for smart agriculture: Concepts, tools, applications, and opportunities,” Int. J. Agric. Biol. Eng., vol. 11, no. 4, pp. 32–44, 2018, doi: 10.25165/j.ijabe.20181104.4475.
  • [2] Y. Xiong, Y. Ge, L. Grimstad, and P. J. From, “An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation,” J. Field Robot., vol. 37, no. 2, pp. 202–224, 2020, doi: 10.1002/rob.21889.
  • [3] A. De Preter, J. Anthonis, and J. De Baerdemaeker, “Development of a Robot for Harvesting Strawberries,” IFAC-PapersOnLine, vol. 51, no. 17, pp. 14–19, 2018, doi: 10.1016/j.ifacol.2018.08.054.
  • [4] Q. Feng, J. Chen, M. Zhang, and X. Wang, “Design and Test of Harvesting Robot for Table-top Cultivated Strawberry,” in 2019 WRC Symposium on Advanced Robotics and Automation (WRC SARA), 2019, pp. 80–85, doi: 10.1109/WRC-SARA.2019.8931922.
  • [5] C. W. Bac, E. J. van Henten, J. Hemming, and Y. Edan, “Harvesting robots for high-value crops: State-of-the-art review and challenges ahead,” J. Field Robot., vol. 31, no. 6, pp. 888–911, 2014.
  • [6] D. Surya Prabha and J. Satheesh Kumar, “Assessment of banana fruit maturity by image processing technique,” J. Food Sci. Technol., vol. 52, no. 3, pp. 1316–1327, 2015, doi: 10.1007/s13197-013-1188-3.
  • [7] W. Castro, J. Oblitas, M. De-La-Torre, C. Cotrina, K. Bazán, and H. Avila-George, “Classification of Cape Gooseberry Fruit According to its Level of Ripeness Using Machine Learning Techniques and Different Color Spaces,” IEEE Access, vol. 7, pp. 27389–27400, 2019, doi: 10.1109/ACCESS.2019.2898223.
  • [8] N. El-Bendary, E. El Hariri, A. E. Hassanien, and A. Badr, “Using machine learning techniques for evaluating tomato ripeness,” Expert Syst. Appl., vol. 42, no. 4, pp. 1892–1905, 2015, doi: 10.1016/j.eswa.2014.09.057.
  • [9] N. Goel and P. Sehgal, “Fuzzy classification of pre-harvest tomatoes for ripeness estimation – An approach based on automatic rule learning using decision tree,” Appl. Soft Comput., vol. 36, pp. 45–56, 2015, doi: 10.1016/j.asoc.2015.07.009.
  • [10] X. Wei, K. Jia, J. Lan, Y. Li, Y. Zeng, and C. Wang, “Automatic method of fruit object extraction under complex agricultural background for vision system of fruit picking robot,” Optik (Stuttg.), vol. 125, no. 19, pp. 5684–5689, 2014, doi: 10.1016/j.ijleo.2014.07.001.
  • [11] F. Sadak, M. Saadat, and A. M. Hajiyavand, “Real-Time Deep Learning-Based Image Recognition for Applications in Automated Positioning and Injection of Biological Cells,” Comput. Biol. Med., p. 103976, 2020, doi: 10.1016/j.compbiomed.2020.103976.
  • [12] Y. Onishi, T. Yoshida, H. Kurita, T. Fukao, H. Arihara, and A. Iwai, “An automated fruit harvesting robot by using deep learning,” Robomech J., vol. 6, no. 1, pp. 1–8, 2019, doi: 10.1186/s40648-019-0141-2.
  • [13] M. H. Saleem, J. Potgieter, and K. M. Arif, “Automation in Agriculture by Machine and Deep Learning Techniques: A Review of Recent Developments,” Precis. Agric., pp. 1–39, 2021, doi: 10.1007/s11119-021-09806-x.
  • [14] F. Gao et al., “Multi-class fruit-on-plant detection for apple in SNAP system using Faster R-CNN,” Comput. Electron. Agric., vol. 176, p. 105634, 2020, doi: 10.1016/j.compag.2020.105634.
  • [15] S. Bargoti and J. Underwood, “Deep fruit detection in orchards,” in 2017 IEEE International Conference on Robotics and Automation (ICRA), 2017, pp. 3626–3633, doi: 10.1109/ICRA.2017.7989417.
  • [16] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2016, pp. 779–788, doi: 10.1109/CVPR.2016.91.
  • [17] Y. Tian, G. Yang, Z. Wang, H. Wang, E. Li, and Z. Liang, “Apple detection during different growth stages in orchards using the improved YOLO-V3 model,” Comput. Electron. Agric., vol. 157, pp. 417–426, 2019, doi: 10.1016/j.compag.2019.01.012.
  • [18] M. O. Lawal, “Tomato detection based on modified YOLOv3 framework,” Sci. Rep., vol. 11, no. 1, p. 1447, 2021, doi: 10.1038/s41598-021-81216-5.
  • [19] A. Koirala, K. B. Walsh, Z. Wang, and C. McCarthy, “Deep learning for real-time fruit detection and orchard fruit load estimation: benchmarking of ‘MangoYOLO,’” Precis. Agric., vol. 20, no. 6, pp. 1107–1135, 2019, doi: 10.1007/s11119-019-09642-0.
  • [20] A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, “YOLOv4: Optimal Speed and Accuracy of Object Detection,” 2020, [Online]. Available: http://arxiv.org/abs/2004.10934.
  • [21] H. Habaragamuwa, Y. Ogawa, T. Suzuki, T. Shiigi, M. Ono, and N. Kondo, “Detecting greenhouse strawberries (mature and immature), using deep convolutional neural network,” Eng. Agric. Environ. Food, vol. 11, no. 3, pp. 127–138, 2018, doi: 10.1016/j.eaef.2018.03.001.
  • [22] P. Pawara, E. Okafor, L. Schomaker, and M. Wiering, “Data Augmentation for Plant Classification,” in European Conference on Computer Vision, 2017, pp. 615–626.
  • [23] A. Hernández-García and P. König, “Further Advantages of Data Augmentation on Convolutional Neural Networks,” in Artificial Neural Networks and Machine Learning – ICANN 2018 (International Conference on Artificial Neural Networks), 2018, pp. 95–103.
  • [24] A. Mikołajczyk and M. Grochowski, “Data augmentation for improving deep learning in image classification problem,” in 2018 International Interdisciplinary PhD Workshop (IIPhDW), 2018, pp. 117–122, doi: 10.1109/IIPHDW.2018.8388338.
  • [25] A. Kamilaris and F. X. Prenafeta-Boldú, “Deep learning in agriculture: A survey,” Comput. Electron. Agric., vol. 147, pp. 70–90, 2018, doi: 10.1016/j.compag.2018.02.016.
  • [26] K. A. Steen, P. Christiansen, H. Karstoft, and R. N. Jørgensen, “Using Deep Learning to Challenge Safety Standard for Highly Autonomous Machines in Agriculture,” J. Imaging, vol. 2, no. 1, 2016, doi: 10.3390/jimaging2010006.
  • [27] M. Dyrmann, H. Karstoft, and H. S. Midtiby, “Plant species classification using deep convolutional neural network,” Biosyst. Eng., vol. 151, pp. 72–80, 2016, doi: 10.1016/j.biosystemseng.2016.08.024.
  • [28] P. Christiansen, L. N. Nielsen, K. A. Steen, R. N. Jørgensen, and H. Karstoft, “DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field,” Sensors, vol. 16, no. 11, 2016, doi: 10.3390/s16111904.
  • [29] Tzutalin, “LabelImg.” 2015, [Online]. Available: https://github.com/tzutalin/labelImg.
  • [30] C.-Y. Wang, A. Bochkovskiy, and H.-Y. M. Liao, “Scaled-YOLOv4: Scaling Cross Stage Partial Network,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021, pp. 13029–13038, [Online]. Available: http://arxiv.org/abs/2011.08036.
  • [31] Y. Shao et al., “Assessment of strawberry ripeness using hyperspectral imaging,” Anal. Lett., vol. 54, no. 10, pp. 1547–1560, 2020.


Details

Primary Language English
Subjects Engineering
Journal Section Articles
Authors

Ferhat Sadak (ORCID: 0000-0003-2391-4836)

Publication Date July 31, 2022
Published in Issue Year 2022

Cite

APA Sadak, F. (2022). Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot. Duzce University Journal of Science and Technology, 10(3), 1589-1602. https://doi.org/10.29130/dubited.1075572
AMA Sadak F. Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot. DÜBİTED. July 2022;10(3):1589-1602. doi:10.29130/dubited.1075572
Chicago Sadak, Ferhat. “Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot”. Duzce University Journal of Science and Technology 10, no. 3 (July 2022): 1589-1602. https://doi.org/10.29130/dubited.1075572.
EndNote Sadak F (July 1, 2022) Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot. Duzce University Journal of Science and Technology 10 3 1589–1602.
IEEE F. Sadak, “Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot”, DÜBİTED, vol. 10, no. 3, pp. 1589–1602, 2022, doi: 10.29130/dubited.1075572.
ISNAD Sadak, Ferhat. “Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot”. Duzce University Journal of Science and Technology 10/3 (July 2022), 1589-1602. https://doi.org/10.29130/dubited.1075572.
JAMA Sadak F. Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot. DÜBİTED. 2022;10:1589–1602.
MLA Sadak, Ferhat. “Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot”. Duzce University Journal of Science and Technology, vol. 10, no. 3, 2022, pp. 1589-602, doi:10.29130/dubited.1075572.
Vancouver Sadak F. Strawberry Ripeness Assessment Via Camouflage-Based Data Augmentation for Automated Strawberry Picking Robot. DÜBİTED. 2022;10(3):1589-602.