Research Article

Classification of Butterfly Species in Turkey with Cascaded Convolutional Neural Networks

Year 2021, Volume: 9, Issue: 3, 568–587, 01.09.2021
https://doi.org/10.36306/konjes.844847

Abstract

Butterflies respond quickly to changes in their ecosystem. Moreover, the larvae of most butterfly species are agricultural and forest pests that affect the habitats and food resources of humans and animals. Classifying butterfly species is therefore important not only for species research but also for environmental protection and for controlling agricultural and forest pests. In this study, seven convolutional neural networks were used with transfer learning to classify the 9 families and 416 butterfly species found in Turkey. To build the dataset, 13,528 images were collected, and data augmentation increased this number to 67,640. Stratified Shuffle Split and k-fold cross-validation were used to prevent memorization during training and to improve the performance and reliability of the networks. Because the large number of species and their similar patterns and colors kept the accuracy of a single network low, a two-stage cascaded model was used, with one network in the first stage and nine networks connected in parallel in the second stage. The first stage classifies images by family with an accuracy of 95.88%, and the second-stage species classifiers reach accuracies between 91.99% and 100%.
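
The article page itself contains no code, and the implementation framework is not stated here. The sketch below is only a minimal illustration of the kind of transfer-learning setup the abstract describes, assuming a TensorFlow/Keras environment; the ResNet50 backbone, the 224x224 input size, the augmentation transforms, and the dataset path are illustrative assumptions rather than the authors' actual configuration.

```python
# Minimal transfer-learning sketch for a first-stage (family) classifier.
# Assumes TensorFlow/Keras and images arranged in one sub-folder per class.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.preprocessing.image import ImageDataGenerator

NUM_FAMILIES = 9            # first-stage classes (butterfly families)
IMG_SHAPE = (224, 224, 3)   # assumed input size

# Data augmentation in the spirit of the 5x enlargement mentioned in the
# abstract; the transforms actually used by the authors are not listed here.
augmenter = ImageDataGenerator(rescale=1.0 / 255,
                               rotation_range=20,
                               zoom_range=0.2,
                               horizontal_flip=True)
train_gen = augmenter.flow_from_directory("dataset/train",   # hypothetical path
                                          target_size=IMG_SHAPE[:2],
                                          batch_size=32,
                                          class_mode="categorical")

# Pretrained ImageNet backbone with its classification head removed.
base = ResNet50(weights="imagenet", include_top=False, input_shape=IMG_SHAPE)
base.trainable = False      # freeze the convolutional features (transfer learning)

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_FAMILIES, activation="softmax"),  # new task-specific head
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_gen, epochs=10)   # epoch count is an arbitrary placeholder
```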

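The abstract names Stratified Shuffle Split and k-fold cross-validation as the measures against memorization, but not the exact protocol. A rough scikit-learn sketch of how the two are commonly combined, with assumed fold counts, seeds, and placeholder data, might look like this:

```python
# Illustrative use of StratifiedShuffleSplit and StratifiedKFold (scikit-learn).
# X holds image file paths and y the class labels; the 80/20 split, the 5 folds
# and the random seeds are assumptions, not values taken from the paper.
import numpy as np
from sklearn.model_selection import StratifiedShuffleSplit, StratifiedKFold

X = np.array([f"img_{i:05d}.jpg" for i in range(1000)])   # placeholder file names
y = np.random.randint(0, 9, size=1000)                    # placeholder family labels

# Hold out a test set while preserving the class proportions.
sss = StratifiedShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
train_idx, test_idx = next(sss.split(X, y))
X_train, y_train = X[train_idx], y[train_idx]

# Cross-validate on the remaining data, again keeping every fold class-balanced.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (tr, va) in enumerate(skf.split(X_train, y_train)):
    # Build and train a fresh network on X_train[tr], validate on X_train[va].
    print(f"fold {fold}: {len(tr)} training / {len(va)} validation images")
```

If the species are unevenly represented in the 13,528 images, plain random splits could leave rare classes out of a fold entirely; stratification guards against exactly that.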

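The core architectural idea is the two-stage cascade: a first network assigns an image to one of the nine families, and the corresponding family-specific network then predicts the species within that family. The inference sketch below shows one way such routing could be wired together; the model file names, label lists, species counts, and preprocessing are hypothetical placeholders.

```python
# Sketch of two-stage (cascaded) inference: family first, then species.
import numpy as np
import tensorflow as tf

FAMILIES = [f"family_{i}" for i in range(9)]   # placeholder family names

# One first-stage network plus one species classifier per family.
family_model = tf.keras.models.load_model("family_classifier.h5")
species_models = {fam: tf.keras.models.load_model(f"species_{fam}.h5")
                  for fam in FAMILIES}
species_labels = {fam: [f"{fam}_species_{j}" for j in range(46)]   # placeholder names;
                  for fam in FAMILIES}                             # real counts differ per family

def classify(image_path):
    """Return the predicted (family, species) pair for a single image."""
    img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
    x = tf.keras.utils.img_to_array(img)[np.newaxis] / 255.0

    # Stage 1: assign the image to one of the nine families.
    family = FAMILIES[int(np.argmax(family_model.predict(x)))]

    # Stage 2: route the image to that family's dedicated species network.
    probs = species_models[family].predict(x)
    species = species_labels[family][int(np.argmax(probs))]
    return family, species
```

This mirrors the motivation given in the abstract: each second-stage network only has to separate the visually similar species within its own family instead of all 416 species at once.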

Details

Primary Language: Turkish
Subjects: Engineering
Section: Research Article
Authors:

Bahadır Elmas (ORCID: 0000-0002-8732-9997)

Publication Date: 1 September 2021
Submission Date: 21 December 2020
Acceptance Date: 7 May 2021
Published in Issue: Year 2021, Volume: 9, Issue: 3

How to Cite

IEEE B. Elmas, “TÜRKİYE’DEKİ KELEBEK TÜRLERİNİN BASAMAKLI EVRİŞİMLİ SİNİR AĞLARI İLE SINIFLANDIRILMASI”, KONJES, c. 9, sy. 3, ss. 568–587, 2021, doi: 10.36306/konjes.844847.