Research Article

Convolutional Neural Networks for Image-Based Digital Plant Phenotyping

Year 2020, EJOSAT Special Issue 2020 (HORA), 338-342, 15.08.2020
https://doi.org/10.31590/ejosat.780087

Abstract

Plants are one of the most important components of the environment. Millions of people are undernourished because the adverse effects of global warming, such as drought, have made sustainable crop breeding programs difficult. This paper proposes and tests an image-based computer vision and machine learning method, specifically a convolutional neural network, on a benchmark suggested by the International Plant Phenotyping Network. The benchmark is intended to help researchers and plant breeders select desirable crop traits and link them to the genes that enable viable plants capable of withstanding harsher environmental conditions. As a first of its kind in Turkey and its surroundings, the paper also aims to provide a foundation for future research in this area of agriculture. The benchmark chosen is the classification-of-mutants benchmark (plant disease detection). The dataset covers two of the main cash crops grown in Turkey: maize and grapes. Three diseases affecting grape and three affecting maize were used, and classes of healthy grape and healthy maize images were added, giving a total of 8 classes and 1600 annotated images for training and testing the proposed custom convolutional neural network. The results show that the custom model achieved 97.03% accuracy on the test dataset after training. The study therefore concludes that the custom model performs better than most currently used convolutional neural network models and can serve as a basis for further research in image-based detection.
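The pipeline described above, training a small custom CNN on 8 classes of annotated grape and maize leaf images and evaluating on a held-out test set, can be sketched as follows. This is a minimal illustrative example in Keras, not the authors' exact architecture; the directory layout (data/train, data/test), image size, layer widths, and epoch count are assumptions.

```python
# Minimal sketch of an 8-class leaf-disease CNN classifier.
# Assumes images are stored as data/train/<class_name>/*.jpg
# and data/test/<class_name>/*.jpg (hypothetical paths).
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (256, 256)   # assumed input resolution
NUM_CLASSES = 8         # 3 grape diseases + 3 maize diseases + 2 healthy classes

# Load labelled images from class-named folders as integer-labelled batches.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=IMG_SIZE, batch_size=32)

# A small stack of conv/pool blocks followed by a dense classifier head.
model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Adam optimizer with sparse categorical cross-entropy for integer labels.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(train_ds, epochs=20)
loss, acc = model.evaluate(test_ds)
print(f"Test accuracy: {acc:.4f}")
```

With a dataset of roughly 1600 images, data augmentation or transfer learning from a pretrained backbone would typically be needed in practice to approach the reported 97.03% test accuracy.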

References

  • Baldi, P., and Brunak, S. Bioinformatics: The Machine Learning Approach.
  • Brahimi, M., Boukhalfa, K., and Moussaoui, A. (2017). Deep Learning for Tomato Diseases: Classification and Symptoms Visualization. Applied Artificial Intelligence.
  • Specht, D. F. (1988). Probabilistic Neural Networks for Classification, Mapping, or Associative Memory. IEEE International Conference on Neural Networks, vol. 1.
  • Gavrila, D., and Philomin, V. (1999). Real-time Object Detection for Smart Vehicles. Proceedings of the Seventh IEEE International Conference on Computer Vision, vol. 1.
  • Lowe, D. (1999). Object Recognition from Local Scale-invariant Features. IEEE International Conference on Computer Vision, vol. 2, pp. 1150–1157.
  • Glorot, X., and Bengio, Y. (2010). Understanding the Difficulty of Training Deep Feedforward Neural Networks. International Conference on Artificial Intelligence and Statistics, Society for Artificial Intelligence and Statistics.
  • Großkinsky, D. K., Svensgaard, J., Christensen, S., and Roitsch, T. (2015). Plant Phenomics and the Need for Physiological Phenotyping Across Scales to Narrow the Genotype-to-Phenotype Knowledge Gap. Journal of Experimental Botany, 66, 5429–5440.
  • Hartmann, A., Czauderna, T., Hoffmann, R., Stein, N., and Schreiber, F. (2011). HTPheno: An Image Analysis Pipeline for High-Throughput Plant Phenotyping. BMC Bioinformatics, 12:148.
  • Ubbens, J., and Stavness, I. (2017). Deep Plant Phenomics: A Deep Learning Platform for Complex Plant Phenotyping Tasks.
  • Kawasaki, Y., Uga, H., Kagiwada, S., and Iyatomi, H. (2015). Basic Study of Automated Diagnosis of Viral Plant Diseases Using Convolutional Neural Networks. International Symposium on Advances in Visual Computing, NV, USA, pp. 63–86.
  • Kingma, D. P., and Ba, J. L. (2015). Adam: A Method for Stochastic Optimization. International Conference on Learning Representations, San Diego, CA, pp. 1–15.
  • Klukas, C., Chen, D., and Pape, J.-M. (2014). Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping. Plant Physiology, 165, pp. 506–518.
  • Kokorian, J., Polder, G., Keurentjes, J. J. B., Vreugdenhil, D., and Olortegui Guzman, M. (2010). An Image-Based Measurement Setup for Automated Phenotyping of Plants. International Conference on Image User and Developer, pp. 178–182.
  • Hagan, M. T., Demuth, H. B., and Beale, M. H. (2002). Neural Network Design.
  • Önder, K., and Mehmet, Ö. (2016). Farklı Lokasyonlarda Yetişen Yoncanın Bazı Fenotip Özelliklerinin Görüntü İşleme Yöntemi ile Belirlenmesi. Akdeniz Üniversitesi Ziraat Fakültesi Dergisi, 28:2.
  • Sladojevic, S., Arsenovic, M., Anderla, A., Culibrk, D., and Stefanovic, D. (2016). Deep Neural Networks Based Recognition of Plant Diseases by Leaf Image Classification. Computational Intelligence and Neuroscience.
  • Tang, X., Liu, M., Zhao, H., and Tao, W. (2009). Leaf Extraction from Complicated Background. IEEE International Congress on Image and Signal Processing, pp. 1–5.
  • Van der Heijden, G., Song, Y., Horgan, G., Polder, G., Dieleman, A., Bink, M., Palloix, A., van Eeuwijk, F., and Glasbey, C. (2012). SPICY: Towards Automated Phenotyping of Large Pepper Plants in the Greenhouse. Functional Plant Biology, 39(11), 870–877.
  • WEBOPEDIA, Webopedia Online Dictionary for Computer and Internet Terms [online, accessed June 17, 2020].
  • White, J. W., Andrade-Sanchez, P., Gore, M. A., Bronson, K. F., Coffelt, T. A., Conley, M. M., Feldmann, K. A., French, A. N., Heun, J. T., Hunsaker, D. J., Jenks, M. A., Kimball, B. A., Roth, R. L., Strand, R. J., Thorp, K. R., Wall, G. W., and Wang, G. (2012). Field-Based Phenomics for Plant Genetics Research. Field Crops Research, 133, 101–112.

Details

Primary Language English
Subjects Engineering
Section Articles
Authors

Tolga Ensari 0000-0003-0896-3058

Dariel Courage Armah

Ahmet Emre Balsever 0000-0002-3655-1571

Mustafa Dağtekin 0000-0002-0797-9392

Publication Date August 15, 2020
Published in Issue Year 2020, EJOSAT Special Issue 2020 (HORA)

How to Cite

APA Ensari, T., Armah, D. C., Balsever, A. E., & Dağtekin, M. (2020). Convolutional Neural Networks for Image-Based Digital Plant Phenotyping. Avrupa Bilim ve Teknoloji Dergisi, 338-342. https://doi.org/10.31590/ejosat.780087