An Artificial Neural Network (ANN) learns from input-output pairs: the values of its weights and biases are updated according to the training data. Researchers have proposed many algorithms to train the Multi-Layer Perceptron (MLP). However, classical techniques often struggle with this optimization problem: they tend to require large amounts of computing time and memory and, more importantly, they get stuck in local optima and produce poor-quality solutions. To overcome these difficulties, meta-heuristic algorithms have been used to train the MLP. In this article, the Butterfly Optimization Algorithm (BOA), which was designed by modeling the behavior of butterflies, is used for the first time to train the multi-layer perceptron. The developed algorithm is named BOA-MLP; in it, the BOA optimizes the values of the weights and biases of the MLP. The success of the BOA-MLP algorithm was tested on five data sets (iris, breast cancer, heart, balloon and xor) that are frequently used in the literature. In the experiments, the BOA-MLP algorithm was compared with the BAT-MLP, SMS-MLP and BP algorithms. The mean and standard deviation of the mean squared error, the average classification accuracy, sensitivity, specificity, precision and F1-score were used as performance metrics. According to the experimental results, the BOA-MLP algorithm surpasses the BAT-MLP, SMS-MLP and BP algorithms on all data sets and shows superior performance.
Keywords: artificial neural networks, butterfly optimization algorithm, multi-layer perceptron, optimization, training of artificial neural networks
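To make the idea behind BOA-MLP concrete, the sketch below shows how an MLP's weights and biases can be flattened into a single vector that each "butterfly" carries, with the mean squared error on the training data as the fitness. This is a minimal illustration, not the authors' exact implementation: the network size, the hyperparameters (sensory modality c, power exponent a, switch probability p) and the XOR toy data are assumptions, and the position updates follow the commonly cited BOA formulation.

```python
# Minimal sketch of BOA-style training of a one-hidden-layer MLP.
# Assumptions (not from the paper): network size, hyperparameters, XOR data.
import numpy as np

rng = np.random.default_rng(0)

# --- MLP with one hidden layer, parameters packed in a single flat vector ---
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def unpack(vec):
    """Split a flat parameter vector into weight matrices and bias vectors."""
    i = 0
    w1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID];                               i += N_HID
    w2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return w1, b1, w2, b2

def mse(vec, X, y):
    """Fitness: mean squared error of the MLP defined by `vec` on (X, y)."""
    w1, b1, w2, b2 = unpack(vec)
    hidden = 1.0 / (1.0 + np.exp(-(X @ w1 + b1)))    # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))  # sigmoid output layer
    return float(np.mean((out - y) ** 2))

# --- Toy XOR data (one of the benchmark sets mentioned in the abstract) ---
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# --- BOA-style loop; hyperparameter values are illustrative assumptions ---
N_BUTTERFLIES, MAX_ITER = 30, 500
c, a, p = 0.01, 0.1, 0.8  # sensory modality, power exponent, switch probability

pop = rng.uniform(-1, 1, size=(N_BUTTERFLIES, DIM))
fitness = np.array([mse(x, X, y) for x in pop])
best = pop[np.argmin(fitness)].copy()

for _ in range(MAX_ITER):
    fragrance = c * fitness ** a                     # f_i = c * I_i^a
    for i in range(N_BUTTERFLIES):
        r = rng.random()
        if r < p:    # global phase: move toward the best butterfly
            pop[i] += (r ** 2 * best - pop[i]) * fragrance[i]
        else:        # local phase: move relative to two random butterflies
            j, k = rng.integers(N_BUTTERFLIES, size=2)
            pop[i] += (r ** 2 * pop[j] - pop[k]) * fragrance[i]
        fitness[i] = mse(pop[i], X, y)
    best = pop[np.argmin(fitness)].copy()

print("best MSE on XOR:", mse(best, X, y))
```

The same encoding and fitness function would apply to the other benchmark data sets; only the input/output dimensions and the data loading would change.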
Primary Language | English |
---|---|
Subjects | Engineering |
Section | Research Article |
Publication Date | 6 December 2021 |
Published Issue | Year 2021 |
Manas Journal of Engineering