Research Article

Sigmoid-Gumbel: A New Hybrid Activation Function

Year 2022, Volume 11, Issue 1, 29 - 45, 24.03.2022
https://doi.org/10.17798/bitlisfen.990508

Abstract

This article proposes a new hybrid activation function (AF), named Sigmoid-Gumbel (SG), which combines the strengths of previously proposed AFs and outperforms them. The proposed function was compared with the sigmoid, Gumbel, ReLU, and Adaptive Gumbel functions. Four experiments were carried out to assess its performance. In the first, 25 imbalanced datasets were tested on a multi-layer perceptron (MLP) network to address the class-imbalance problem in binary classification with deep learning. In the second, the MNIST dataset was tested on a CNN to compare the proposed function with the sigmoid and Gumbel functions. In the third, three different versions of SG (SGv2, SGv3, and SGv4) were compared using the Fashion-MNIST dataset on the same CNN. In the fourth, the MNIST dataset was tested on the same CNN to compare the proposed function with the ReLU and Adaptive Gumbel functions. The experimental results demonstrate the success of the proposed AF.
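The abstract names SG's building blocks, the logistic sigmoid and the Gumbel activation (the CDF of the standard Gumbel distribution, common in the class-imbalance literature), but does not state SG's closed form. The sketch below is a minimal Python illustration of the two components; the `sigmoid_gumbel` combination is a hypothetical placeholder (a simple average), not the authors' actual formula:

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

def gumbel(x):
    # CDF of the standard Gumbel distribution, e^{-e^{-x}}, used as an
    # activation (link) function for imbalanced binary classification.
    return np.exp(-np.exp(-x))

def sigmoid_gumbel(x):
    # HYPOTHETICAL blend: the abstract does not give SG's closed form,
    # so this placeholder simply averages the two component activations.
    return 0.5 * (sigmoid(x) + gumbel(x))

x = np.linspace(-5.0, 5.0, 11)
print(sigmoid_gumbel(x))  # smooth, monotonic, bounded in (0, 1)
```

Any such blend stays smooth, strictly increasing, and bounded in (0, 1), which is what makes it usable as a drop-in output activation for binary classification.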

References

  • Süzen A.A., Yıldız Z. and Yılmaz T. 2019. LSTM tabanlı Derin Sinir Ağı ile Ayak Taban Basınç Verilerinden VKİ Durumlarının Sınıflandırılması. Bitlis Eren Üniversitesi Fen Bilimleri Dergisi, 8 (4): 1392-1398.
  • Munkhdalai L., Munkhdalai T., Namsrai O.E., Lee J.Y. and Ryu K.H. 2019. An Empirical Comparison of Machine-Learning Methods on Bank Client Credit Assessments. Sustainability, 11 (3).
  • Munkhdalai L., Wang L., Park H.W. and Ryu K.H. 2019. Advanced Neural Network Approach, Its Explanation with LIME for Credit Scoring Application. Asian Conference on Intelligent Information and Database Systems (ACIIDS), Springer, Cham, 407-419.
  • Wang S., Liu W., Wu J., Cao L., Meng Q. and Kennedy P.J. 2016. Training deep neural networks on imbalanced data sets. International Joint Conference on Neural Networks (IJCNN), 4368-4374.
  • Zhou F., Yang S., Fujita H., Chen D. and Wen C. 2020. Deep learning fault diagnosis method based on global optimization GAN for unbalanced data. Knowledge-Based Systems, 187.
  • Goodfellow I., Bengio Y. and Courville A. Deep Learning. http://www.deeplearningbook.org. (Accessed: 15.01.2021).
  • Manessi F. and Rozza A. 2019. Learning Combinations of Activation Functions. arXiv:1801.09403v3 [cs.LG].
  • Munkhdalai L., Munkhdalai T. and Ryu K.H. 2020. GEV-NN: A deep neural network architecture for class imbalance problem in binary classification. Knowledge-Based Systems, 194.
  • Apicella A., Isgrò F. and Prevete R. 2019. A simple and efficient architecture for trainable activation functions. Neurocomputing, 370, 1-15.
  • Ding B., Qian H. and Zhou J. 2018. Activation functions and their characteristics in deep neural networks. 30th Chinese Control And Decision Conference (CCDC), Shenyang, China, 1836-1841.
  • Babu K.V.N. and Edla D.R. 2017. New Algebraic Activation Function for Multi-Layered Feed Forward Neural Networks. IETE Journal of Research, 63 (1): 71-79.
  • Ramachandran P., Zoph B. and Le Q.V. 2017. Swish: a Self-Gated Activation Function. arXiv:1710.05941v1 [cs.NE].
  • Xu B., Huang R. and Li M. 2016. Revise Saturated Activation Functions. arXiv:1602.05980v2 [cs.LG].
  • Agostinelli F., Hoffman M., Sadowski P. and Baldi P. 2015. Learning activation functions to improve deep neural networks. arXiv:1412.6830v3 [cs.NE].
  • Zhou Y., Li D., Huo S. and Kung S.Y. 2021. Shape autotuning activation function. Expert Systems with Applications, 171.
  • Farhadi F., Nia V.P. and Lodi A. 2019. Activation Adaptation in Neural Networks. arXiv:1901.09849v2 [cs.LG].
  • Cai C., Xu Y., Ke D. and Su K. 2015. Deep Neural Networks with Multistate Activation Functions. Hindawi Publishing Corporation, Computational Intelligence and Neuroscience, 2015, 1-10.
  • Koçak Y. and Şiray G.Ü. 2021. New activation functions for single layer feedforward neural network. Expert Systems with Applications, 164.
  • Patterson J. and Gibson A. 2017. Deep Learning: A Practitioner’s Approach, 1st ed., Sebastopol, California, USA, O’Reilly.
  • Dong B. and Wang X. 2016. Comparison deep learning method to traditional methods using for network intrusion detection. 8th IEEE International Conference on Communication Software and Networks (ICCSN), 581-585.
  • Deng L. 2014. A tutorial survey of architectures, algorithms, and applications for deep learning. APSIPA Transactions on Signal and Information Processing, 3: 1-29.
  • Gumbel E.J. 1941. The Return Period of Flood Flows. Annals of Mathematical Statistics, 12 (2): 163-190.
  • Cooray K. 2010. Generalized Gumbel distribution. Journal of Applied Statistics, 37 (1): 171-179.
  • Zhang H., Liu G., Pan L., Meng K. and Li J. 2016. GEV Regression with Convex Loss Applied to Imbalanced Binary Classification. IEEE First International Conference on Data Science in Cyberspace (DSC).
  • Huang J. and Ling C.X. 2005. Using AUC and accuracy in evaluating learning algorithms. IEEE Transactions on Knowledge and Data Engineering, 17 (3): 299-310.
  • KEEL-dataset Data set repository. https://sci2s.ugr.es/keel/imbalanced.php. (Accessed: 05.01.2021).
  • LeCun Y., Cortes C. and Burges C.J. 2021. The MNIST database of handwritten digits. http://yann.lecun.com/exdb/mnist/. (Accessed: 25.01.2021).
  • Xiao H., Rasul K. and Vollgraf R. 2017. Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv:1708.07747v2 [cs.LG].
There are 28 citations in total.

Details

Primary Language Turkish
Subjects Engineering
Journal Section Research Article
Authors

Mustafa Kaytan 0000-0002-8416-6520

İbrahim Berkan Aydilek 0000-0001-8037-8625

Celaleddin Yeroğlu 0000-0002-6106-2374

Ali Karci 0000-0002-8489-8617

Publication Date March 24, 2022
Submission Date September 3, 2021
Acceptance Date March 4, 2022
Published in Issue Year 2022

Cite

IEEE M. Kaytan, İ. B. Aydilek, C. Yeroğlu, and A. Karci, “Sigmoid-Gumbel: Yeni Bir Hibrit Aktivasyon Fonksiyonu”, Bitlis Eren Üniversitesi Fen Bilimleri Dergisi, vol. 11, no. 1, pp. 29–45, 2022, doi: 10.17798/bitlisfen.990508.



Bitlis Eren University
Journal of Science Editorial Office

Bitlis Eren University Graduate Education Institute
Beş Minare Mah. Ahmet Eren Bulvarı, Merkez Kampüs, 13000 BİTLİS
E-mail: fbe@beu.edu.tr