Research Article

Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse

Year 2024, Volume: 5, Issue: 2, 76-90, 20.12.2024
https://doi.org/10.58769/joinssr.1498561

Abstract

Human beings, who have been engaged in agriculture and animal husbandry for centuries, must constantly monitor, care for, and maintain their own farmland and animals, which demands continuous labor and time. The aim and originality of this study is to detect foxes that threaten, harm, or carry off animals such as chickens, geese, ducks, and turkeys living in the coops of poultry farmers. In this way, the farmers' material and moral losses can be prevented even when they are not at the henhouse. To achieve this, many images were collected to form a dataset, which was classified according to whether or not a fox was present in the henhouse. The outputs of the DenseNet, MobileNet, ResNet50, VGG16, VGG19, Xception, and Yolov8 architectures were then fine-tuned in a transfer-learning setup to detect the presence of a fox in the henhouse. The models were trained, and their performances were compared in terms of metrics such as loss, accuracy, precision, and F1 score. According to the results, the Yolov8 architectures generally demonstrated the best performance.
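The article does not publish its code, but the transfer-learning setup the abstract describes can be sketched as follows. This is a minimal illustration assuming a Keras/TensorFlow implementation with a frozen ImageNet-pretrained MobileNetV2 backbone (one of the families named above) and a new binary head for the fox / no-fox decision; the folder layout, image size, and hyperparameters are placeholders, not the authors' actual settings.

```python
# Minimal transfer-learning sketch (assumed Keras/TensorFlow implementation):
# freeze an ImageNet-pretrained backbone and train only a new binary head that
# decides whether a fox is present near the henhouse. Paths, image size and
# hyperparameters are illustrative, not the paper's actual configuration.
import tensorflow as tf

IMG_SIZE = (224, 224)      # assumed input resolution
DATA_DIR = "dataset/"      # assumed layout: dataset/fox, dataset/no_fox

train_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=32, label_mode="binary")

# Pretrained backbone with its original classifier removed; weights stay frozen.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

# New head fine-tuned for the two-class (fox / no fox) problem.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(name="precision"),
                       tf.keras.metrics.Recall(name="recall")])

history = model.fit(train_ds, validation_data=val_ds, epochs=10)

# F1 is derived from the last validation precision and recall values.
p = history.history["val_precision"][-1]
r = history.history["val_recall"][-1]
print("val F1:", 2 * p * r / (p + r + 1e-9))
```

Swapping the base model (and its matching input preprocessing) gives the DenseNet, ResNet50, VGG16/19, and Xception variants of the comparison; a Yolov8 counterpart is sketched below.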

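For the Yolov8 side of the comparison, a classification-style training run with the Ultralytics package might look like the sketch below. The checkpoint size (yolov8n-cls.pt), dataset path, epoch count, and example image path are assumptions for illustration; the abstract does not give the exact configuration, and a detection-style setup would instead use a detection checkpoint with a YAML dataset description.

```python
# Minimal Yolov8 classification sketch using the Ultralytics API (assumed setup).
# The dataset folder is expected to contain train/ and val/ subfolders, each with
# one directory per class (e.g. fox/, no_fox/); checkpoint size, epochs, and the
# example image path below are illustrative, not the paper's configuration.
from ultralytics import YOLO

model = YOLO("yolov8n-cls.pt")                     # pretrained classification checkpoint
model.train(data="dataset", epochs=10, imgsz=224)  # fine-tune on the fox / no_fox images

metrics = model.val(data="dataset")                # evaluate on the val split
print("top-1 accuracy:", metrics.top1)

results = model("henhouse_cam.jpg")                # single-image prediction (hypothetical file)
probs = results[0].probs
print(results[0].names[probs.top1], float(probs.top1conf))
```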


Details

Primary Language English
Subjects Deep Learning
Section Research Articles
Authors

Murat Erhan Çimen 0000-0002-1793-485X

Publication Date December 20, 2024
Submission Date June 10, 2024
Acceptance Date September 18, 2024
Published in Issue Year 2024, Volume: 5, Issue: 2

How to Cite

APA Çimen, M. E. (2024). Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse. Journal of Smart Systems Research, 5(2), 76-90. https://doi.org/10.58769/joinssr.1498561
AMA Çimen ME. Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse. JoinSSR. December 2024;5(2):76-90. doi:10.58769/joinssr.1498561
Chicago Çimen, Murat Erhan. “Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse”. Journal of Smart Systems Research 5, no. 2 (December 2024): 76-90. https://doi.org/10.58769/joinssr.1498561.
EndNote Çimen ME (01 December 2024) Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse. Journal of Smart Systems Research 5 2 76–90.
IEEE M. E. Çimen, “Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse”, JoinSSR, vol. 5, no. 2, pp. 76–90, 2024, doi: 10.58769/joinssr.1498561.
ISNAD Çimen, Murat Erhan. “Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse”. Journal of Smart Systems Research 5/2 (December 2024), 76-90. https://doi.org/10.58769/joinssr.1498561.
JAMA Çimen ME. Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse. JoinSSR. 2024;5:76–90.
MLA Çimen, Murat Erhan. “Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse”. Journal of Smart Systems Research, vol. 5, no. 2, 2024, pp. 76-90, doi:10.58769/joinssr.1498561.
Vancouver Çimen ME. Comparison of Deep Learning and Yolov8 Models for Fox Detection Around the Henhouse. JoinSSR. 2024;5(2):76-90.