Research Article

Effect of topological changes of resilient small-world networks on performance in the function approximation problem

Year 2020, , 1398 - 1406, 30.09.2020
https://doi.org/10.31202/ecjse.787819

Abstract

The performance of feed-forward artificial neural networks is directly related to their topology. In recent years, improving network performance through topological changes has become an important line of research. In this respect, the small-world network model, an approach in which the synaptic connections between neurons are rewired, holds an important place for future studies. In this study, how the performance of the resilient small-world network model changes with layer-wise and neuron-wise modifications is investigated for function approximation problems. The results show that the resilient small-world network improves performance considerably compared with the traditional resilient artificial neural network topology, and that increasing the number of hidden layers contributes significantly to network performance. On the other hand, increasing the number of hidden-layer neurons together with the number of layers is observed to affect performance negatively. In this context, it is shown that resilient small-world networks with multiple hidden layers have a positive effect on solving the function approximation problem, and it is emphasized that this model may play a key role in improving the performance of deep neural networks in the future.
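
To make the approach described in the abstract concrete, the sketch below shows one way such an experiment could be set up. It is a minimal illustration, not the author's exact model: the layer sizes, the shortcut probability p, the placement of the shortcut links (from the first hidden layer directly to the output, skipping the second hidden layer), and the sin(3x) target function are all assumptions made here for demonstration. Following the Newman-Watts idea cited in the references, extra shortcut connections are added on top of the regular feed-forward links (none are removed), and the weights are trained with resilient backpropagation (PyTorch's Rprop optimizer) on a toy function approximation task.

```python
# Hedged sketch (assumptions noted above): a feed-forward network with
# Newman-Watts-style shortcut links, trained with resilient backpropagation
# (Rprop) on a toy function-approximation task.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallWorldMLP(nn.Module):
    """Two-hidden-layer MLP plus sparse shortcut links that let the first
    hidden layer feed the output directly, skipping the second hidden layer.
    Each potential shortcut edge exists with probability p; the regular
    feed-forward links are kept (Newman-Watts style: add, do not rewire)."""

    def __init__(self, n_in=1, n_hidden=10, n_out=1, p=0.2):
        super().__init__()
        self.h1 = nn.Linear(n_in, n_hidden)
        self.h2 = nn.Linear(n_hidden, n_hidden)
        self.out = nn.Linear(n_hidden, n_out)
        self.shortcut = nn.Linear(n_hidden, n_out, bias=False)
        # Fixed random mask choosing which shortcut edges actually exist.
        self.register_buffer("mask", (torch.rand(n_out, n_hidden) < p).float())

    def forward(self, x):
        z1 = torch.tanh(self.h1(x))
        z2 = torch.tanh(self.h2(z1))
        # Regular feed-forward output plus the masked shortcut contribution.
        return self.out(z2) + F.linear(z1, self.shortcut.weight * self.mask)


# Toy function-approximation task: learn y = sin(3x) on [-1, 1].
torch.manual_seed(0)
x = torch.linspace(-1, 1, 200).unsqueeze(1)
y = torch.sin(3 * x)

model = SmallWorldMLP()
optimizer = torch.optim.Rprop(model.parameters())  # resilient backpropagation
loss_fn = nn.MSELoss()

for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final MSE: {loss.item():.6f}")
```

In the paper's setting the shortcuts are presumably placed between neurons of non-adjacent layers throughout the network, and performance is compared across different numbers of hidden layers and neurons; the sketch only illustrates the general mechanism of adding sparse shortcut edges on top of a layered topology and training them with Rprop.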

References

  • [1] Haykin, S., Neural Networks: A Comprehensive Foundation, (2nd Edition), Prentice-Hall, Englewood Cliffs, NJ, 1999.
  • [2] Magnitskii, N. A., Some New Approaches to the Construction and Learning of Artificial Neural Networks, Computational Mathematics and Modeling, 2001, 12, 293-304.
  • [3] Sun, M., Stam, A., Steuer, R. E., Solving Multiple Objective Programming Problems Using Feed-Forward Artificial Neural Networks: The Interactive FFANN Procedure, Man. Sci., 1996, 42(6), 835-849.
  • [4] Erkaymaz, O., Ozer, M. & Yumusak, N., Impact of small-world topology on the performance of a feed-forward artificial neural network based on 2 different real-life problems, Turk. J. Elec. Eng. & Comp. Sci., 2014, 22, 708-718.
  • [5] Riedmiller, M. & Braun, H., A direct adaptive method for faster backpropagation learning: The RPROP algorithm, Proceedings of the IEEE International Conference on Neural Networks, 1993, 586-591.
  • [6] Riedmiller, M., Braun, H., Neural speed controller trained online by means of modified RPROP algorithm, IEEE Trans. Ind. Inform., 2015, 11, 586-591.
  • [7] Shrestha, S. B. & Song, Q., Robust learning in SpikeProp, Neural Networks, 2017, 86, 54-68.
  • [8] Yilmaz, E., Baysal, V., Ozer, M., Perc, M., Autaptic pacemaker mediated propagation of weak rhythmic activity across small-world neuronal networks, Phys. A, 2016, 444, 538-546.
  • [9] Yilmaz, E., Baysal, V., Perc, M., Ozer, M., Enhancement of pacemaker induced stochastic resonance by an autapse in a scale-free neuronal network, Sci. China Tech. Sci., 2016, 59(3), 364-370.
  • [10] Watts, D. J. & Strogatz, S. H., Collective dynamics of 'small-world' networks, Nature, 1998, 393, 440-442.
  • [11] Watts, D. J., Small Worlds: The Dynamics of Networks Between Order and Randomness, Princeton, NJ: Princeton University Press, 2003.
  • [12] Newman, M. E. J. & Watts, D. J., Scaling and percolation in the small-world network model, Phys. Rev. E, 1999, 60, 7332-7342.
  • [13] Kawai, Y., Park, J. & Asada, M., A small-world topology enhances the echo state property and signal propagation in reservoir computing, Neural Networks, 2019, 112, 15-23.
  • [14] Barabasi, A.-L., Albert, R., Emergence of Scaling in Random Networks, Science, 1999, 286, 509-512.
  • [15] Simard, D., Nadeau, L. & Kröger, H., Fastest learning in small-world neural networks, Physics Letters A, 2005, 336(1), 8-15.
  • [16] Shuzhong, Y., Siwei, L., Jianyu, L., Building Multi-layer Small World Neural Network, Lect. Notes in Comp. Sci. Ser., 2006, 3971, 695-700.
  • [17] Erkaymaz, O., Ozer, M., Impact of small-world network topology on the conventional artificial neural network for the diagnosis of diabetes, Chaos Sol. and Fract., 2016, 83, 178-185.
  • [18] Erkaymaz, O., Ozer, M., Impact of Newman-Watts Small-World Approach on the Performance of Feed-Forward Artificial Neural Networks, Karaelmas Science and Engineering Journal, 2016, 6(1), 187-194.
  • [19] Erkaymaz, O., Ozer, M. & Perc, M., Performance of small-world feedforward neural networks for the diagnosis of diabetes, Appl. Math. Comput., 2017, 311, 22-28.
  • [20] Erkaymaz, O., Resilient back-propagation approach in small-world feed-forward neural network topology based on Newman-Watts algorithm, Neural Comput. & Applic., 2020, https://doi.org/10.1007/s00521-020-05161-6.
  • [21] Latora, V. & Marchiori, M., Efficient behavior of small-world networks, Phys. Rev. Lett., 2001, 87(19), 198701.

Details

Primary Language Turkish
Subjects Engineering
Journal Section Makaleler (Articles)
Authors

Okan Erkaymaz (ORCID: 0000-0002-1996-8623)

Publication Date September 30, 2020
Submission Date August 30, 2020
Acceptance Date September 12, 2020
Published in Issue Year 2020

Cite

IEEE O. Erkaymaz, “Fonksiyon yaklaşımı probleminde esnek küçük-dünya ağlarının topolojik değişiminin performansa etkisi”, ECJSE, vol. 7, no. 3, pp. 1398–1406, 2020, doi: 10.31202/ecjse.787819.