Research Article

The Effect of Hidden Neurons in Single-Hidden Layer Feedforward Neural Networks

Year 2019, Volume: 12, Issue: 4, 277-286, 29.10.2019
https://doi.org/10.17671/gazibtd.465886

Abstract

Especially in the last decade, Artificial Intelligence has been gaining popularity, since deep learning and neural networks offer fast and powerful machine-learning techniques that can solve many real-time problems efficiently. In this study, the Extreme Learning Machine (ELM), which is capable of fast and accurate learning, is used to optimize the parameters of Single-hidden Layer Feedforward Neural Networks (SLFNs). The learning capability of such systems is directly related to the effectiveness of their parameters and calculation methods. The number of hidden neurons, one of the parameters in these calculations, is discussed and its role is examined. Not only is the importance of selecting this value appropriately emphasized, but a new method is also proposed for its proper selection. The proposed method, the Normalized Average Value (NAV), is a simple and effective formulation derived from statistical methods in this field. Experimental results for determining the correct number of hidden neurons (L) show that random selection of this number causes either overfitting or underfitting. NAV can be applied to any such algorithm to reach better learning rates. The results show that determining the number of hidden neurons L according to this study, rather than selecting it at random, provides a 10-15% performance increase.
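For context, the sketch below illustrates how the hidden-neuron count L enters standard ELM training of an SLFN: input weights and biases are drawn at random, and only the output weights are solved for with the Moore-Penrose pseudoinverse. This is a minimal illustration under assumptions, not the paper's implementation; the function names (elm_train, elm_predict), the sigmoid activation, and the toy data are illustrative choices, and the proposed NAV formula for choosing L is not reproduced here. Sweeping L on a held-out split is one way to observe the under- and overfitting behaviour that the abstract attributes to a poorly chosen L.

```python
# Minimal ELM sketch for a single-hidden-layer feedforward network (SLFN).
# Assumes NumPy; names and data are hypothetical, for illustration only.
import numpy as np

def elm_train(X, T, L, rng):
    """Train an SLFN with ELM: random hidden layer, analytic output weights.

    X: (N, d) inputs, T: (N, m) one-hot targets, L: number of hidden neurons.
    """
    d = X.shape[1]
    W = rng.standard_normal((d, L))          # random input weights (never trained)
    b = rng.standard_normal(L)               # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer output matrix, sigmoid g(.)
    beta = np.linalg.pinv(H) @ T             # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)       # predicted class labels

if __name__ == "__main__":
    # Toy binary-classification data; a held-out split exposes how accuracy
    # varies with L (too few neurons underfit, too many tend to overfit).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 8))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    T = np.eye(2)[y]                          # one-hot targets
    X_tr, X_te, T_tr, y_te = X[:150], X[150:], T[:150], y[150:]
    for L in (2, 10, 50, 200):
        W, b, beta = elm_train(X_tr, T_tr, L, rng)
        acc = np.mean(elm_predict(X_te, W, b, beta) == y_te)
        print(f"L={L:4d}  test accuracy={acc:.3f}")
```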

References

  • X. Xue, M. Yao, Z. Wu, “A novel ensemble-based wrapper method for feature selection using extreme learning machine and genetic algorithm”, Knowledge and Information Systems, 1-24, 2017.
  • A. Deniz, H. E. Kiziloz, T. Dokeroglu, A. Coşar, “A Robust multi-objective evolutionary feature subset selection algorithm for binary classification using machine-learning techniques”, Neurocomputing, 241, 128-146, 2017.
  • Internet: UCI Irvine Machine Learning Repository, http://archive.ics.uci.edu/ml/datasets.html, October 2018.
  • E. Sevinç, “Activation Functions in Single Hidden Layer Feed-forward Neural Networks”, Selçuk-Teknik Dergisi, 1-13, 2018.
  • Internet: ELM Classification web page, http://www.ntu.edu.sg/home/egbhuang/elm_random_hidden_nodes.html, October 2018.
  • G. B. Huang, Q. Y. Zhu, C. K. Siew, “Extreme learning machine: Theory and applications”, Neurocomputing, 70, 489-501, 2006.
  • G. B. Huang, X. Ding, H. Zhou, “Optimization method based extreme learning machine for classification”, Neurocomputing, 74, 155-163, 2010.
  • G. B. Huang, H. Zhou, X. Ding, “Extreme learning machine for regression and multiclass classification”, IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 42, 513-529, 2012.
  • H. C. Yuan, F. L. Xiong, X. Y. Huai, “A method for estimating the number of hidden neurons in feed-forward neural networks based on information entropy”, Computers and Electronics in Agriculture, 40, 57-64, 2003.
  • G. Feng, Z. Qian, X. Zhang, “Evolutionary selection extreme learning machine optimization for regression”, Soft Computing, 16, 1485-1491, 2012.
  • G. B. Huang, H. A. Babri, “Upper Bounds on the Number of Hidden Neurons in Feedforward Networks with Arbitrary Bounded Nonlinear Activation Functions”, IEEE Transactions on Neural Networks, 9(1), January 1998.
  • E. J. Teoh, K. C. Tan, C. Xiang, “Estimating the Number of Hidden Neurons in a Feedforward Network Using the Singular Value Decomposition”, IEEE Transactions on Neural Networks, 17(6), November 2006.
  • A. Akbaş, H. U. Yildiz, A. M. Ozbayoglu, B. Tavli, “Neural network based instant parameter prediction for wireless sensor network optimization models”, Wireless Networks, 1-14, 2018.
  • M. Karakaya, E. Sevinç, “An Efficient Genetic Algorithm for Routing Multiple UAVs under Flight Range and Service Time Window Constraints”, Bilişim Teknolojileri Dergisi, 10(1), 113, 2017.
  • K. G. Sheela, S. N. Deepa, “Review on Methods to Fix Number of Hidden Neurons in Neural Networks”, Mathematical Problems in Engineering, 2013, 2013.
  • E. Sevinc, T. Dokeroglu, “A novel hybrid teaching learning based optimization algorithm for the classification of data by using extreme learning machines”, Turkish Journal of Electrical Engineering & Computer Sciences, 27(2), 1523-1533, 2019.
  • T. Dokeroglu, E. Sevinc, “Evolutionary parallel extreme learning machines for the data classification problem”, Computers & Industrial Engineering, 130, 237-249, 2019.
  • X. Zeng, D. S. Yeung, “Hidden neuron pruning of multilayer perceptrons using a quantified sensitivity measure”, Neurocomputing, 69(7-9), 825-837, 2006.
  • M. G. Augasta, T. Kathirvalavakumar, “A novel pruning algorithm for optimizing feedforward neural network of classification problems”, Neural Processing Letters, 34(3), 241, 2011.
  • C. R. Rao, S. K. Mitra, Generalized Inverse of Matrices and its Applications, John Wiley and Sons Inc., New York, USA, 1971.
  • Y. Lan, Y. C. Soh, G. B. Huang, “Two-stage extreme learning machine for regression”, Neurocomputing, 73, 3028-3038, 2010.
  • C. Hamzaçebi, F. Kutay, “Durağan Zaman Serilerinin Yapay Sinir Ağları Ile Tahmininde Girdi Nöronu Ve Gizli Nöron Sayısının Belirlenmesi”, TÜİK İstatistik Araştırma Dergisi, 4, 2005.
  • R. Kohavi, “A study of cross-validation and bootstrap for accuracy estimation and model selection”, IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence, vol. 2, 1137-1143, 1995.
  • M. H. Calp, “İşletmeler için personel yemek talep miktarının yapay sinir ağları kullanılarak tahmin edilmesi”, Politeknik Dergisi, 22(3), 675-686, 2019.


Details

Primary Language: English
Subjects: Computer Software
Section: Articles
Authors

Ender Sevinç (ORCID: 0000-0001-7670-722X)

Publication Date: 29 October 2019
Submission Date: 30 September 2018
Published Issue: Year 2019, Volume: 12, Issue: 4

Cite

APA Sevinç, E. (2019). The Effect of Hidden Neurons in Single-Hidden Layer Feedforward Neural Networks. Bilişim Teknolojileri Dergisi, 12(4), 277-286. https://doi.org/10.17671/gazibtd.465886