Research Article

Detection Of Pufferfish Using Computer Vision And Deep Learning Methods

Year 2025, Volume: 12, Issue: 2, pp. 218–234, 03.05.2025

Abstract

The opening of the Suez Canal and the construction of the Aswan Dam have significantly impacted the Mediterranean ecosystem. These changes have increased species migration from the Red Sea into the Mediterranean, allowing new species to spread, causing economic losses, and threatening human health. Among them, the pufferfish is a toxic species with no natural predators and a wide distribution.
This study focuses on training an object detection model to identify the pufferfish (Lagocephalus sceleratus) using computer vision and deep learning techniques. YOLO (You Only Look Once), a leading object detection algorithm, was used. Training data were gathered from diving schools and instructors in the Mediterranean; frames extracted from underwater videos were labeled to create a dataset of 2,473 images.
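Labeling frames for YOLO training typically produces, per image, a text file of normalized bounding boxes (class index, box center, width, and height as fractions of the image size). The paper does not publish its tooling, so the conversion below is only a minimal sketch of that standard format, using a hypothetical pufferfish box on a 1920x1080 frame:

```python
def to_yolo_label(cls_id: int, box: tuple, img_w: int, img_h: int) -> str:
    """Convert a pixel-space box (x_min, y_min, x_max, y_max) into a
    YOLO-format label line: "class x_center y_center width height",
    with all four geometry values normalized to [0, 1]."""
    x_min, y_min, x_max, y_max = box
    xc = (x_min + x_max) / 2 / img_w   # normalized box center, x
    yc = (y_min + y_max) / 2 / img_h   # normalized box center, y
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{cls_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Hypothetical pufferfish box centered on a 1920x1080 frame
print(to_yolo_label(0, (480, 270, 1440, 810), 1920, 1080))
# → 0 0.500000 0.500000 0.500000 0.500000
```

One such line per labeled fish, in a `.txt` file sharing the image's name, is what YOLO-family trainers consume alongside the frames.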
The YOLOv8m variant achieved the best result, with a mAP of 96.90%. The model detected pufferfish more reliably from head-on and side angles; however, difficulties in manually labeling tails and fins slightly affected the model's focus.
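The mAP figure above rests on the intersection-over-union (IoU) overlap between each predicted box and its labeled counterpart: a detection counts as correct only when that overlap exceeds a threshold. A minimal sketch of the IoU computation, with hypothetical box coordinates (not taken from the paper):

```python
def iou(a: tuple, b: tuple) -> float:
    """Intersection-over-union of two boxes, each (x_min, y_min, x_max, y_max)."""
    # Overlap rectangle corners
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    # Clamp to zero when the boxes do not overlap
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Hypothetical predicted box vs. labeled box
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # ≈ 0.1429 (400 / 2800)
```

mAP then averages precision over recall levels (and, in the COCO convention, over several IoU thresholds), so a 96.90% mAP indicates boxes that overlap the labels tightly across almost all test frames.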
These findings could help control pufferfish populations using underwater robots and automated systems, contributing to ecological balance. Derived from the first author's master's thesis in the Department of Management Information Systems at Akdeniz University, the study offers a foundation for future sustainable solutions.

Ethical Statement

This study received ethical approval from the Scientific Research and Publication Ethics Committee of Akdeniz University, with the decision dated September 5, 2023, under reference number 370.

Acknowledgments

We would like to extend our gratitude to Professor Dr. Mehmet Gökoğlu, Associate Professor Dr. Ahmet Balcı, Star Diving Academy, Posseidon Kemer Diving, and Diving Instructor Orkun Tekin for their support during this study.

References

  • [1] M. El-Raey, “Impacts and implications of climate change for the coastal zones of Egypt,” Coastal Zones Climate Change, vol. 7, pp. 31–50, 2010. [Online]. Available: https://www.stimson.org/wp-content/files/file-attachments/Mohamed_1.pdf Accessed on: May 5, 2023.
  • [2] S. Mavruk and D. Avsar, “Non-native fishes in the Mediterranean from the Red Sea, by way of the Suez Canal,” Rev. Fish Biol. Fish., vol. 18, no. 3, pp. 251–262, 2008, DOI: 10.1007/s11160-007-9073-7.
  • [3] J. P. Rodrigue, The Geography of Transport Systems, 5th ed. New York, NY, USA: Routledge, 2020.
  • [4] D. Golani, “Impact of Red Sea fish migrants through the Suez Canal on the aquatic environment of the eastern Mediterranean,” Yale F&ES Bull., vol. 103, pp. 375–387, 1998.
  • [5] A. Zenetos, S. Gofas, C. Morri, A. Rosso, D. Violanti, J. E. García-Raso, and N. Streftaris, “Additions to the marine alien fauna of Greek waters (2007 update),” Mediterr. Mar. Sci., vol. 9, no. 1, pp. 119–165, 2008.
  • [6] M. E. Çınar, M. Bilecenoğlu, B. Öztürk, T. Katağan, M. B. Yokeş, V. Aysel, and E. Dağli, “An updated review of alien species on the coasts of Turkey,” Mediterr. Mar. Sci., vol. 12, no. 2, pp. 1–19, 2011.
  • [7] M. E. Çınar, M. Bilecenoğlu, M. B. Yokeş, B. Öztürk, E. Taşkın, K. Bakır, and S. Açık, “Current status (as of end of 2020) of marine alien species in Turkey,” PLoS ONE, vol. 16, no. 5, p. e0251086, 2021, DOI: 10.1371/journal.pone.0251086.
  • [8] O. Akyol, V. Ünal, T. Ceyhan, and M. Bilecenoğlu, “First confirmed record of the lionfish Pterois volitans (Linnaeus, 1758) in the Mediterranean Sea,” J. Fish Biol., vol. 66, no. 4, pp. 1183–1186, 2005.
  • [9] A. Zenetos, S. Gofas, M. Verlaque, M. E. Çinar, J. E. García Raso, C. N. Bianchi, and N. Streftaris, “Alien species in the Mediterranean Sea by 2010. A contribution to the application of European Union's Marine Strategy Framework Directive (MSFD). Part I. Spatial distribution,” Mediterr. Mar. Sci., vol. 11, no. 2, pp. 381–493, 2010.
  • [10] M. Kulbicki, Y. M. Bozec, P. Labrosse, Y. Letourneur, G. Mou-Tham, and L. Wantiez, “Diet composition of carnivorous fishes from coral reef lagoons of New Caledonia,” Aquat. Living Resour., vol. 18, no. 3, pp. 231–250, 2005.
  • [11] S. Tüzün, Yapay Zeka ve Uygulamaları. İstanbul, Türkiye: Papatya Yayıncılık Eğitim, 2012.
  • [12] V. Arvind, A. Ranjan, and D. Saha, “Machine learning techniques for fish detection and classification: A survey,” Aquaculture, vol. 504, pp. 331–345, 2019.
  • [13] M. Chuang, J. N. Hwang, K. Williams, and A. Das, “Fish species recognition from video using neural networks,” J. Vis. Commun. Image Represent., vol. 42, pp. 140–148, 2017.
  • [14] L. Kezebou, I. Zennouhi, Z. Bousalem, and A. Said, “Deep learning approaches for fish species recognition in underwater videos: Performance comparison,” Procedia Comput. Sci., vol. 151, pp. 113–120, 2019.
  • [15] C. Spampinato, J. Chen-Burger, R. B. Fisher, G. Nadarajan, and D. Giordano, “Detecting, tracking and counting fish in low quality unconstrained underwater videos,” in Proc. VISAPP, Madeira, Portugal, 2008, pp. 514–519.
  • [16] B. Xu and Z. Cheng, “Fish detection and tracking in real-life underwater environment,” Multimedia Tools Appl., vol. 76, no. 20, pp. 21171–21187, 2017.
  • [17] M. Ahmad, M. A. Khattak, and S. Ali, “The impact of AI on modern industry,” J. Innov. Manage., vol. 9, no. 4, pp. 78–95, 2021.
  • [18] P. Bharadiya, “Advanced AI techniques in financial markets: A review,” Financial Innov., vol. 9, no. 2, pp. 155–170, 2023.
  • [19] K. Crawford, The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven, CT, USA: Yale Univ. Press, 2021.
  • [20] Y. Duan, J. S. Edwards, and Y. K. Dwivedi, “Artificial intelligence for decision making in the era of Big Data–evolution, challenges and research agenda,” Int. J. Inf. Manage., vol. 48, pp. 63–71, 2019.
  • [21] T. Hwang, “Computational power and the social impact of artificial intelligence,” arXiv preprint, 2018. [Online]. Available: https://arxiv.org/abs/1803.08971 Accessed on: Oct. 12, 2022.
  • [22] Y. Lu, “Artificial intelligence: A survey on evolution, models, applications and future trends,” J. Manage. Anal., vol. 6, no. 1, pp. 1–29, 2019.
  • [23] C. Zhang and Y. Lu, “Study on image recognition of fish behavior based on machine learning,” Sci. Rep., vol. 11, no. 1, pp. 1–10, 2021.
  • [24] S. B. Maind and P. Wankar, “Research paper on basic of artificial neural network,” Int. J. Recent Innov. Trends Comput. Commun., vol. 2, no. 1, pp. 96–100, 2014.
  • [25] Y. Bengio, A. Courville, and P. Vincent, “Representation learning: A review and new perspectives,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 35, no. 8, pp. 1798–1828, 2013.
  • [26] W. Liu, Z. Wang, X. Liu, N. Zeng, Y. Liu, and F. E. Alsaadi, “A survey of deep neural network architectures and their applications,” Neurocomputing, vol. 234, pp. 11–26, 2017.
  • [27] Y. Guo, Y. Liu, A. Oerlemans, S. Lao, S. Wu, and M. S. Lew, “Deep learning for visual understanding: A review,” Neurocomputing, vol. 187, pp. 27–48, 2016.
  • [28] Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, 2015.
  • [29] S. Pouyanfar, S. Sadiq, Y. Yan, H. Tian, Y. Tao, M. P. Reyes, and S. S. Iyengar, “A survey on deep learning: Algorithms, techniques, and applications,” ACM Comput. Surv., vol. 51, no. 5, pp. 1–36, 2018.
  • [30] C. Beyan and H. I. Browman, “Fish recognition and species classification in underwater videos: A survey,” Fish. Res., vol. 226, p. 105521, 2020.
  • [31] W. Zhao, Y. Zhang, L. Yu, and S. Du, “Automatic fish classification system using deep learning,” Comput. Electron. Agric., vol. 170, p. 105254, 2019.
  • [32] D. Li, Y. Yang, Z. Yin, D. Luo, and S. Ma, “A novel deep learning approach for fish detection in real-life underwater environments,” Ocean Eng., vol. 250, p. 110867, 2022.
  • [33] H. Cui, Y. Zhang, X. Chen, and L. Wei, “Real-time fish tracking and classification in underwater videos using deep learning,” Pattern Recognit. Lett., vol. 140, pp. 1–8, 2020.
  • [34] H. Wang, L. Zhang, and J. Li, “End-to-end neural network for real-time detection and tracking of abnormal fish behavior,” Neurocomputing, vol. 494, pp. 375–385, 2022.
  • [35] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., Las Vegas, NV, USA, 2016, pp. 779–788.
  • [36] X. Zhou, D. Wang, and P. Krähenbühl, “Deep learning for automatic fish species classification,” Fish. Res., vol. 204, pp. 1–10, 2017.
  • [37] J. Redmon and A. Farhadi, “YOLOv3: An incremental improvement,” arXiv preprint, 2018. [Online]. Available: https://arxiv.org/abs/1804.02767 Accessed on: Jun. 8, 2023.
  • [38] K. Kim, K. Kim, and S. Jeong, “Application of YOLO v5 and v8 for recognition of safety risk factors at construction sites,” Sustainability, vol. 15, no. 20, p. 15179, 2023.
  • [39] A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, “YOLOv4: Optimal speed and accuracy of object detection,” arXiv preprint, 2020. [Online]. Available: https://arxiv.org/abs/2004.10934 Accessed on: Jan. 20, 2023.
  • [40] Ultralytics, “YOLOv5,” GitHub Repository, 2020. [Online]. Available: https://github.com/ultralytics/yolov5 Accessed on: Feb. 5, 2023.
  • [41] THU-MIG, “YOLOv10,” GitHub Repository, 2024. [Online]. Available: https://github.com/THU-MIG/yolov10 Accessed on: May 10, 2024.
  • [42] E. M. Ditria, S. Lopez-Marcano, M. Sievers, E. L. Jinks, and R. M. Connolly, “Deep learning for underwater fish detection: Accuracy and efficiency of human and automated methods,” Ecol. Inform., vol. 55, p. 101019, 2020.
  • [43] N. Wang, Y. Zhang, and J. Han, “Fish detection and classification in complex underwater environments using deep learning,” Inf. Sci., vol. 578, pp. 298–314, 2022.
  • [44] X. Li, Y. You, J. Wu, and H. Zhou, “Fish species recognition in underwater video based on deep convolutional neural networks,” J. Comput. Sci., vol. 10, pp. 124–133, 2015.
  • [45] S. Villon, D. Mouillot, M. Chaumont, and T. Claverie, “Automatic underwater fish species classification with limited data using deep learning,” Ecol. Inform., vol. 48, pp. 124–131, 2018.
  • [46] D. Hridayami, Y. Tjahyadi, and W. Widhiarso, “Deep learning-based fish classification system for real-time underwater fish monitoring,” Procedia Comput. Sci., vol. 157, pp. 447–453, 2019.
  • [47] V. Allken, N. O. Handegard, S. Rosen, T. Schreyeck, T. Mahiout, and K. Malde, “Fish species identification using a convolutional neural network trained on synthetic data,” ICES J. Mar. Sci., vol. 76, no. 1, pp. 342–349, 2019.
  • [48] A. Jalal, M. Nasir, and K. Kim, “Fish detection and species classification in underwater environments using deep learning with temporal information,” Sensors, vol. 20, no. 12, p. 3452, 2020.
  • [49] A. Salman, A. Jalal, and K. Kim, “Region-based convolutional neural network for fish detection in underwater environments,” Sensors, vol. 20, no. 5, p. 1400, 2020.
  • [50] M. Hussain, A. Jalal, and K. Kim, “Modified AlexNet for fish classification in underwater environments,” Comput. Electr. Eng., vol. 93, p. 107231, 2021.
  • [51] H. Wang, S. Zhu, M. Wang, and J. Ye, “Advances in deep learning: Applications in the healthcare sector,” J. Healthcare Eng., vol. 2022, no. 1, pp. 1–14, 2022.
  • [52] B. Patro, S. Mukhopadhyay, and P. P. Roy, “Real-time object detection using YOLOv5 for fish monitoring in adverse environments,” J. Ocean Technol., vol. 18, no. 2, pp. 61–73, 2023.
  • [53] Y. Eği, “YOLO v7 and computer vision-based mask-wearing warning system for congested public areas,” J. Inst. Sci. Technol., vol. 13, no. 1, pp. 22–32, 2023.
  • [54] Z. Yücel and D. Çetintaş, “Yolov9 ile kan hücrelerinin otomatik tanımlanması: Optimizasyon ve öğrenme oranı etkileri,” Adıyaman Üniv. Müh. Bilim. Derg., vol. 11, no. 22, pp. 125–135, 2024.
  • [55] D.-J. Shin and J.-J. Kim, “A deep learning framework performance evaluation to use YOLO in Nvidia Jetson platform,” Appl. Sci., vol. 12, no. 8, p. 3734, 2022.
  • [56] C. Li, Y. Zhang, X. Wu, and Y. Yang, “Deep learning-based fish behavior analysis in underwater videos,” Inf. Sci., vol. 579, pp. 92–105, 2022.
  • [57] T. Y. Lin, P. Goyal, R. Girshick, K. He, and P. Dollár, “Focal loss for dense object detection,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 42, no. 2, pp. 318–327, 2020.
  • [58] J. Terven and D. Cordova-Esparza, “A comprehensive review of YOLO architectures in computer vision: From YOLOv1 to YOLOv8 and YOLO-NAS,” arXiv preprint, 2023. [Online]. Available: https://arxiv.org/abs/2304.00501 Accessed on: Mar. 15, 2024.
  • [59] G. Jocher and M. R. Munawar, “Ultralytics YOLOv8,” GitHub Repository, 2023. [Online]. Available: https://github.com/ultralytics/yolov8 Accessed on: Mar. 15, 2024.
  • [60] V. Viswanatha, R. K. Chandana, and A. C. Ramachandra, “Real time object detection system with YOLO and CNN models: A review,” arXiv preprint, 2022. [Online]. Available: https://arxiv.org/abs/2208.00773 Accessed on: Mar. 12, 2023.
  • [61] P. Jiang, D. Ergu, F. Liu, Y. Cai, and B. Ma, “A review of Yolo algorithm developments,” Procedia Comput. Sci., vol. 199, pp. 1066–1073, 2022.
  • [62] M. Hussain, “YOLOv1 to v8: Unveiling each variant–A comprehensive review of YOLO,” IEEE Access, vol. 12, pp. 42816–42833, 2024.
  • [63] A. Vijayakumar and S. Vairavasundaram, “YOLO-based object detection models: A review and its applications,” Multimedia Tools Appl., vol. 83, no. 35, pp. 83535–83574, 2024, DOI: 10.1007/s11042-024-18872-y.
  • [64] C. F. G. D. Santos and J. P. Papa, “Avoiding overfitting: A survey on regularization methods for convolutional neural networks,” ACM Comput. Surv., vol. 54, no. 10, pp. 1–25, 2022.
  • [65] S. Aburass, “Quantifying overfitting: Introducing the overfitting index,” arXiv preprint, 2023. [Online]. Available: https://arxiv.org/abs/2308.08682 Accessed on: Oct. 15, 2023.
  • [66] L. Prechelt, “Early stopping — But when?,” in Neural Networks: Tricks of the Trade. Berlin, Germany: Springer, 2012.
  • [67] J. Brownlee, “A gentle introduction to early stopping to avoid overtraining neural networks,” Machine Learning Mastery, 2019. [Online]. Available: https://machinelearningmastery.com/how-to-stop-training-deep-neural-networks-at-the-right-time-using-early-stopping/ Accessed on: Apr. 1, 2024.
  • [68] S. Paguada, L. Batina, I. Buhan, and I. Armendariz, “Being patient and persistent: Optimizing an early stopping strategy for deep learning in profiled attacks,” IEEE Trans. Comput., vol. 74, no. 3, pp. 1–12, 2024.
  • [69] C. Zhu, Y. Zheng, K. Luu, and M. Savvides, “CMS-RCNN: Contextual multi-scale region-based CNN for unconstrained face detection,” in Deep Learning for Biometrics. Cham, Switzerland: Springer, 2017.
  • [70] T. Y. Lin, P. Goyal, R. Girshick, K. He, and P. Dollár, “Focal loss for dense object detection,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 42, no. 2, pp. 318–327, 2020.
  • [71] K. Su, L. Cao, B. Zhao, N. Li, D. Wu, and X. Han, “N-IoU: Better IoU-based bounding box regression loss for object detection,” Neural Comput. Appl., vol. 36, no. 6, pp. 3049–3063, 2024.
  • [72] P. Liu, G. Zhang, B. Wang, H. Xu, X. Liang, Y. Jiang, and Z. Li, “Loss function discovery for object detection via convergence-simulation driven search,” arXiv preprint, 2021. [Online]. Available: https://arxiv.org/abs/2102.04700 Accessed on: Apr. 8, 2023.
  • [73] P. Jayanthi, “Machine learning and deep learning algorithms in disease prediction: Future trends for the healthcare system,” in Deep Learning for Medical Applications with Unique Data. Cambridge, MA, USA: Academic Press, 2022.
  • [74] P. D. K. Reddy, M. Margala, S. S. Shankar, and P. Chakrabarti, “Early fire danger monitoring system in smart cities using optimization-based deep learning techniques with artificial intelligence,” J. Reliable Intell. Environ., vol. 10, no. 2, pp. 197–210, 2024.
  • [75] Z. Ding, L. Sun, X. Mao, L. Dai, and B. Xu, “Location-independent adversarial patch generation for object detection,” J. Electron. Imaging, vol. 32, no. 4, p. 043035, 2023.
  • [76] V. Kshirsagar, R. H. Bhalerao, and M. Chaturvedi, “Modified YOLO module for efficient object tracking in a video,” IEEE Latin Amer. Trans., vol. 21, no. 3, pp. 389–398, 2023.


Details

Primary Language English
Subjects Engineering Practice
Section Research Articles
Authors

Hüseyin Umut Yüksel 0000-0003-1891-3644

Güray Tonguç 0000-0002-5476-7114

Publication Date May 3, 2025
Submission Date January 29, 2025
Acceptance Date March 24, 2025
Published Issue Year 2025, Volume: 12, Issue: 2

How to Cite

IEEE H. U. Yüksel and G. Tonguç, “Detection Of Pufferfish Using Computer Vision And Deep Learning Methods”, ECJSE, vol. 12, no. 2, pp. 218–234, 2025, doi: 10.31202/ecjse.1628790.