Research Article
Training of the feed-forward artificial neural networks using butterfly optimization algorithm

Year 2021, Volume: 9, Issue: 2, 160 - 168, 06.12.2021
https://doi.org/10.51354/mjen.917837

Abstract

An Artificial Neural Network (ANN) learns from input-output pairs: the values of its weights and biases are updated according to the training data. Researchers have proposed many algorithms to train the Multi-Layer Perceptron (MLP). However, classical techniques often struggle with this optimization problem: they tend to require large amounts of computing time and memory and, more importantly, they get stuck in local optima and produce poor-quality solutions. To overcome these difficulties, meta-heuristic algorithms have been used to train the MLP. In this article, the Butterfly Optimization Algorithm (BOA), which was designed by modeling the behaviors of butterflies, was used for the first time to train the multi-layer perceptron. The developed algorithm, named BOA-MLP, uses BOA to optimize the values of the weights and biases of the MLP. The success of the BOA-MLP algorithm was tested on five data sets (iris, breast cancer, heart, balloon and xor) that are frequently used in the literature. In the experiments, the BOA-MLP algorithm was compared with the BAT-MLP, SMS-MLP and BP algorithms. The mean and standard deviation of the mean squared error, the average classification accuracy, sensitivity, specificity, precision and the F1-score were used as performance metrics. According to the experimental results, the BOA-MLP algorithm surpasses the BAT-MLP, SMS-MLP and BP algorithms on all data sets.
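The sketch below illustrates, under stated assumptions, the training scheme described in this abstract: the Butterfly Optimization Algorithm searches over a single flat vector holding all of the MLP's weights and biases, and each candidate vector is scored by the mean squared error of the resulting network. The 2-4-1 topology, the BOA parameters (sensory modality c, power exponent a, switch probability p), the intensity mapping 1/(1+MSE) and the greedy replacement step are illustrative assumptions rather than the settings reported in the paper; the XOR data set is one of the five benchmarks listed above.

```python
# Minimal BOA-MLP sketch: BOA (Arora & Singh, 2019) minimizes the MSE of a
# small feed-forward network whose weights and biases are encoded as one
# flat vector. Topology and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR data set (one of the five benchmarks mentioned in the abstract)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1                          # assumed 2-4-1 topology
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT    # all weights + biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(vec):
    """Decode a flat vector into MLP weights/biases and return the MSE."""
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def boa_train(pop_size=30, iters=500, c=0.01, a=0.1, p=0.8):
    """Standard BOA loop minimizing the MSE of the encoded MLP."""
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, DIM))
    cost = np.array([mse(ind) for ind in pop])
    best, best_cost = pop[cost.argmin()].copy(), cost.min()
    for _ in range(iters):
        # Fragrance f = c * I^a; I = 1/(1+MSE) is an assumed mapping so that
        # smaller errors produce stronger fragrance in this minimization task.
        fragrance = c * (1.0 / (1.0 + cost)) ** a
        for i in range(pop_size):
            r = rng.random()
            if rng.random() < p:    # global search toward the best butterfly
                step = (r ** 2) * best - pop[i]
            else:                   # local search between two random butterflies
                j, k = rng.integers(pop_size, size=2)
                step = (r ** 2) * pop[j] - pop[k]
            candidate = pop[i] + step * fragrance[i]
            cand_cost = mse(candidate)
            if cand_cost < cost[i]:  # greedy replacement (an added assumption)
                pop[i], cost[i] = candidate, cand_cost
                if cand_cost < best_cost:
                    best, best_cost = candidate.copy(), cand_cost
    return best, best_cost

weights, err = boa_train()
print(f"best MSE on XOR: {err:.5f}")
```

Swapping in another benchmark only requires replacing X, y and the layer sizes; the classification metrics named above (accuracy, sensitivity, specificity, precision, F1-score) would then be computed from the trained network's thresholded predictions.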

References

  • Jaddi N.S., Abdullah S., “Optimization of neural network using kidney-inspired algorithm with control of filtration rate and chaotic map for real-world rainfall forecasting.” Engineering Applications of Artificial Intelligence, 67, (2018), 246-259.
  • Türkoğlu B., “Artificial algae algorithm on training artificial neural networks.” Selcuk University Natural Science Institute, 2019.
  • Haupt R.L., Haupt S.E., Practical Genetic Algorithms, 2004.
  • Mirjalili S., “How effective is the Grey Wolf optimizer in training multi-layer perceptrons.” Applied Intelligence, 43(1), (2015), 150-161.
  • Kulluk S., Ozbakir L., Baykasoglu A., “Training neural networks with harmony search algorithms for classification problems.” Engineering Applications of Artificial Intelligence, 25(1), (2012), 11-19.
  • Ghaleini E.N., et al., “A combination of artificial bee colony and neural network for approximating the safety factor of retaining walls.” Engineering with Computers, 35(2), (2019), 647-658.
  • Tang R., Fong S., Deb S., Vasilakos A.V., Millham R.C., “Dynamic group optimisation algorithm for training feed-forward neural networks.” Neurocomputing, 314, (2018), 1-19.
  • Zhang L., Suganthan P.N., “A survey of randomized algorithms for training neural networks.” Information Sciences, 364, (2016), 146-155.
  • Ojha V.K., Abraham A., Snášel V., “Metaheuristic design of feedforward neural networks: A review of two decades of research.” Engineering Applications of Artificial Intelligence, 60, (2017), 97-116.
  • Hacibeyoglu M., Ibrahim M.H., “A novel multimean particle swarm optimization algorithm for nonlinear continuous optimization: application to feed-forward neural network training.” Scientific Programming. 2018.
  • Gülcü Ş., “Training of the Artificial Neural Networks using States of Matter Search Algorithm.” International Journal of Intelligent Systems and Applications in Engineering, 8(3), (2020), 131-136.
  • Arora S., Singh S., “Butterfly optimization algorithm: a novel approach for global optimization.” Soft Computing, 23(3), (2019), 715-734.
  • Tümer A., Edebali S., Gülcü Ş., “Modeling of Removal of Chromium (VI) from Aqueous Solutions Using Artificial Neural Network.” Iranian Journal of Chemistry and Chemical Engineering (IJCCE), 39(1), (2020), 163-175.
  • Madenci E., Gülcü Ş., “Optimization of flexure stiffness of FGM beams via artificial neural networks by mixed FEM.” Structural Engineering and Mechanics, 75(5), (2020), 633-642.
  • Karaşahin A.T., Tümer A.E., “Real time traffic signal timing approach based on artificial neural network.” MANAS Journal of Engineering, 8(1), (2020), 49-54.
  • Pandya A.S., Macy R.B., “Pattern Recognition with Neural Networks in C++.” CRC Press, Florida, (1996).
  • Keskenler M.F., Keskenler E.F., “From Past to Present Artificial Neural Networks and History.” Takvim-i Vekayi, 5(2), (2017), 8-18.
  • Hamzaçebi C., “Yapay sinir ağları: tahmin amaçlı kullanımı MATLAB ve Neurosolutions uygulamalı.” 2011: Ekin Basım Yayın Dağıtım.
  • Belew R.K., McInerney J., Schraudolph N.N., “Evolving Networks: Using the Genetic Algorithm.” 1990.
  • Aljarah I., Faris H., Mirjalili S., “Optimizing connection weights in neural networks using the whale optimization algorithm.” Soft Computing, 22(1), (2018), 1-15.
  • Yang X.-S., “A new metaheuristic bat-inspired algorithm,” Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Springer, (2010), 65-74.
There are 21 references in total.

Details

Primary Language English
Subjects Engineering
Section Research Article
Authors

Büşra Irmak 0000-0003-0972-7954

Şaban Gülcü 0000-0001-7714-8861

Publication Date December 6, 2021
Published in Issue Year 2021 Volume: 9 Issue: 2

Cite

APA Irmak, B., & Gülcü, Ş. (2021). Training of the feed-forward artificial neural networks using butterfly optimization algorithm. MANAS Journal of Engineering, 9(2), 160-168. https://doi.org/10.51354/mjen.917837
AMA Irmak B, Gülcü Ş. Training of the feed-forward artificial neural networks using butterfly optimization algorithm. MJEN. December 2021;9(2):160-168. doi:10.51354/mjen.917837
Chicago Irmak, Büşra, and Şaban Gülcü. “Training of the Feed-Forward Artificial Neural Networks Using Butterfly Optimization Algorithm”. MANAS Journal of Engineering 9, no. 2 (December 2021): 160-68. https://doi.org/10.51354/mjen.917837.
EndNote Irmak B, Gülcü Ş (01 December 2021) Training of the feed-forward artificial neural networks using butterfly optimization algorithm. MANAS Journal of Engineering 9 2 160–168.
IEEE B. Irmak and Ş. Gülcü, “Training of the feed-forward artificial neural networks using butterfly optimization algorithm”, MJEN, vol. 9, no. 2, pp. 160–168, 2021, doi: 10.51354/mjen.917837.
ISNAD Irmak, Büşra - Gülcü, Şaban. “Training of the Feed-Forward Artificial Neural Networks Using Butterfly Optimization Algorithm”. MANAS Journal of Engineering 9/2 (December 2021), 160-168. https://doi.org/10.51354/mjen.917837.
JAMA Irmak B, Gülcü Ş. Training of the feed-forward artificial neural networks using butterfly optimization algorithm. MJEN. 2021;9:160–168.
MLA Irmak, Büşra, and Şaban Gülcü. “Training of the Feed-Forward Artificial Neural Networks Using Butterfly Optimization Algorithm”. MANAS Journal of Engineering, vol. 9, no. 2, 2021, pp. 160-8, doi:10.51354/mjen.917837.
Vancouver Irmak B, Gülcü Ş. Training of the feed-forward artificial neural networks using butterfly optimization algorithm. MJEN. 2021;9(2):160-8.

Manas Journal of Engineering 
