Research Article

EVSEL ATIKLARIN DERİN ÖĞRENME TEKNİKLERİ İLE AYRIŞTIRILMASI
(SEPARATION OF DOMESTIC WASTE WITH DEEP LEARNING TECHNIQUES)

Year 2022, Volume 7, Issue 1, 10-19, 06.06.2022
https://doi.org/10.53070/bbd.1071536

Abstract

Thanks to the rapid development of deep learning technology, smart systems are being developed for use in almost every part of daily life. These applications not only make our lives easier but also benefit the natural environment. Traditional waste-sorting methods fall short in terms of efficiency and accuracy; besides their high cost, they can also pose environmental risks. In recent years, artificial intelligence, machine learning, and the deep learning techniques built on them have become popular approaches to complex problems such as sorting organic, domestic, and packaging waste. This study addresses the problem of separating domestic waste, which is of great importance both for humans and other living beings and for the protection of nature. Using deep learning, a sub-branch of machine learning within artificial intelligence, the classification performances of popular convolutional neural network (CNN) based architectures, namely ResNet-50, DenseNet-121, Inception-V3, and VGG16, were compared for detecting and sorting domestic waste.
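
The excerpt gives no implementation details for this comparison, so the following is only a minimal sketch of how such an experiment is commonly set up, written here in Python with TensorFlow/Keras as an assumed framework (the dataset path waste_images, the class count, the input size, and the training settings are illustrative placeholders, not values from the paper): each ImageNet-pretrained backbone is frozen, a small classification head is trained on a folder of labelled waste images, and validation accuracy is recorded per architecture.

# Minimal sketch (not the authors' code): comparing ImageNet-pretrained CNN
# backbones on a labelled waste-image dataset via transfer learning.
# DATA_DIR, NUM_CLASSES, IMG_SIZE and EPOCHS are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.applications import resnet50, densenet, inception_v3, vgg16

DATA_DIR = "waste_images"      # hypothetical: one sub-folder per waste class
IMG_SIZE = (224, 224)
NUM_CLASSES = 6                # assumed number of waste categories
EPOCHS = 5

# 80/20 train/validation split over the image folder (integer labels).
train_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=32)

# Each architecture is paired with its own input-preprocessing function.
BACKBONES = {
    "ResNet-50":    (resnet50.ResNet50,        resnet50.preprocess_input),
    "DenseNet-121": (densenet.DenseNet121,     densenet.preprocess_input),
    "Inception-V3": (inception_v3.InceptionV3, inception_v3.preprocess_input),
    "VGG16":        (vgg16.VGG16,              vgg16.preprocess_input),
}

results = {}
for name, (backbone_cls, preprocess) in BACKBONES.items():
    # Frozen ImageNet feature extractor plus a small trainable classification head.
    base = backbone_cls(include_top=False, weights="imagenet",
                        input_shape=IMG_SIZE + (3,), pooling="avg")
    base.trainable = False

    inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
    x = preprocess(inputs)
    x = base(x, training=False)
    x = layers.Dropout(0.2)(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=EPOCHS, verbose=0)
    _, acc = model.evaluate(val_ds, verbose=0)
    results[name] = acc

for name, acc in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name}: validation accuracy = {acc:.3f}")

Freezing the backbones keeps the comparison inexpensive and isolates the contribution of the pretrained features; unfreezing the top layers for a short fine-tuning phase would typically raise the accuracy of all four architectures.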

References

  • BBC. (2019, July 3). Retrieved December 7, 2021, from https://www.bbc.com/turkce/haberler-dunya-48851661
  • Alom, M. Z., Taha, T. M., Yakopcic, C., Westberg, S., Sidike, P., Nasrin, M. S., . . . Asari, V. K. (2018). The history began from AlexNet: A comprehensive survey on deep learning approaches. arXiv preprint arXiv:1803.01164.
  • Beliën, J., Boeck, L. D., & Ackere, J. V. (2011). Municipal Solid Waste Collection Problems: A Literature Review.
  • Bircanoğlu, C., Atay, M., Beşer, F., Genç, Ö., & Kızrak, M. (2018). RecycleNet: Intelligent waste sorting using deep neural networks. In 2018 Innovations in Intelligent Systems and Applications (INISTA) (pp. 1-7).
  • Doğan, F., & Türkoğlu, İ. (2019). Derin öğrenme modelleri ve uygulama alanlarına ilişkin bir derleme. Dicle Üniversitesi Mühendislik Fakültesi Mühendislik Dergisi, 409-445.
  • El Naqa, I., Haider, M. A., Giger, M. L., & Ten Haken, R. K. (2020). Artificial intelligence: Reshaping the practice of radiological sciences in the 21st century. The British Journal of Radiology.
  • Evans, G. W. (2017). Artificial intelligence: where we came from, where we are now, and where we are going.
  • Ferguson, M., Ak, R., Lee, Y. T., & Law, K. H. (2017). Automatic localization of casting defects with convolutional neural networks. In 2017 IEEE International Conference on Big Data (Big Data) (pp. 1726-1735).
  • Gulec, N., Gunal, B., & Erler, A. (2001). Assessment of soil and water contamination around an ash-disposal site: a case study from the Seyitömer coal-fired power plant in western Turkey.
  • He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1026-1034).
  • Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556.
  • LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444.
  • Li, X., Shen, X., Zhou, Y., Wang, X., & Li, T. Q. (2020). Classification of breast cancer histopathological images using interleaved DenseNet with SENet (IDSNet). PLoS ONE, 15(5).
  • Liu, C., Sharan, L., Adelson, E., & Rosenholtz, R. (2010, June). Exploring features in a Bayesian framework for material recognition. In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (pp. 239-246).
  • Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., & Wojna, Z. (2016). Rethinking the Inception architecture for computer vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 2818-2826).
  • Themelis, N. J. (2002). Integrated management of solid wastes for New York City. In North American Waste-to-Energy Conference (pp. 69-86).
  • Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433-460.
  • Wikipedia. (2021, October 24). ENIAC. Retrieved December 7, 2021, from https://tr.wikipedia.org/wiki/ENIAC
  • www.kaggle.com. (2019, November 12). Retrieved December 7, 2021, from https://www.kaggle.com/imrshu/solidwaste
  • Xia, X., Xu, C., & Nan, B. (2017, June). Inception-v3 for flower classification. In 2017 2nd International Conference on Image, Vision and Computing (ICIVC) (pp. 783-787).
  • Ziadat, A., & Mott, H. (2005). Assessing solid waste recycling opportunities for closed campuses. Management of Environmental Quality: An International Journal, 250-256.


Details

Primary Language: Turkish
Subjects: Artificial Intelligence
Section: PAPERS
Authors

Yunus Emre Karaca 0000-0002-9398-084X

Serpil Aslan 0000-0001-8009-063X

Cengiz Hark 0000-0002-5190-3504

Early View Date: 5 June 2022
Publication Date: 6 June 2022
Submission Date: 10 February 2022
Acceptance Date: 19 March 2022
Published in Issue: Year 2022, Volume 7, Issue 1

How to Cite

APA: Karaca, Y. E., Aslan, S., & Hark, C. (2022). EVSEL ATIKLARIN DERİN ÖĞRENME TEKNİKLERİ İLE AYRIŞTIRILMASI. Computer Science, 7(1), 10-19. https://doi.org/10.53070/bbd.1071536

The Creative Commons Attribution 4.0 International License is applied to all research papers published by JCS, and a Digital Object Identifier (DOI) is assigned to each published paper.