Research Article

Year 2025, Volume: 13 Issue: 4, 1137 - 1157, 01.12.2025
https://doi.org/10.36306/konjes.1679144


THE STUDY OF THE HYPERPARAMETER OPTIMIZATION ANALYSIS IN THE CONVOLUTIONAL NEURAL NETWORKS MODEL WITH STARFISH OPTIMIZATION ALGORITHM


Abstract

In this study, the hyperparameters of Convolutional Neural Networks (CNNs) are optimized with the recently proposed Starfish Optimization Algorithm (SFOA). Because of its structure, a CNN has many interacting hyperparameters, whose values are mostly determined in the literature by combinatorial trial. The success of metaheuristic algorithms in optimizing the variables of many different problems motivated this study. The most critical CNN hyperparameters were restricted to candidate sets: four channel numbers (8, 16, 32, and 64), five kernel sizes (1×1, 3×3, 5×5, 7×7, and 9×9), four batch sizes (32, 64, 128, and 256), twenty learning-rate values generated randomly between 0 and 0.05, three optimizer types (sgdm, adam, and rmsprop), and four epoch numbers (5, 10, 15, and 20). A six-dimensional solution space was defined for SFOA, and these hyperparameter values were placed in discretely defined dimensions; in each iteration, SFOA searched for the most appropriate hyperparameter values for the CNN model. Two image datasets, MNIST and Kuzushiji-MNIST, were selected for CNN classification. Through the hyperparameter optimization carried out with SFOA, accuracies of 99.52% on MNIST and 97.91% on Kuzushiji-MNIST were achieved. Comparisons with the existing literature show that the proposed model is successful and competitive. Finally, the proposed CNN models were evaluated on a third image dataset, EMNIST (Extended MNIST), a more comprehensive version of MNIST developed for classifying handwritten letters and digits. The accuracies on EMNIST were 88.65% for the proposed CNN model with the hyperparameter settings found for MNIST and 88.73% for the model with the settings found for Kuzushiji-MNIST. In addition, hyperparameters were determined directly on EMNIST with SFOA, yielding an accuracy of 88.71%. Analyzing the hyperparameters of the three resulting CNN models showed that similar optimizer types, epoch numbers, kernel sizes, and channel numbers were preferred, which demonstrates that SFOA produces reliable and consistent settings across different datasets.
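To make the search-space encoding concrete, the sketch below illustrates one common way a continuous metaheuristic such as SFOA can be mapped onto the discrete six-dimensional space described above. This is an illustrative assumption, not the authors' published code; the names `SEARCH_SPACE` and `decode` are hypothetical.

```python
import random

# Candidate values for the six CNN hyperparameters, as listed in the abstract.
SEARCH_SPACE = [
    [8, 16, 32, 64],                                         # number of channels
    [1, 3, 5, 7, 9],                                         # kernel size (n x n)
    [32, 64, 128, 256],                                      # batch size
    [round(random.uniform(0, 0.05), 4) for _ in range(20)],  # learning rate
    ["sgdm", "adam", "rmsprop"],                             # optimizer type
    [5, 10, 15, 20],                                         # number of epochs
]

def decode(position):
    """Map a continuous 6-dimensional position with coordinates in [0, 1)
    to one discrete candidate value per hyperparameter dimension."""
    config = []
    for x, candidates in zip(position, SEARCH_SPACE):
        idx = min(int(x * len(candidates)), len(candidates) - 1)
        config.append(candidates[idx])
    return config

# Example: decode one candidate solution drawn uniformly at random.
position = [random.random() for _ in range(6)]
print(decode(position))  # e.g. [32, 5, 64, 0.0123, 'adam', 15] (values vary)
```

Decoding a continuous position into discrete indices like this is a standard device for applying continuous metaheuristics to categorical hyperparameters; the fitness of each decoded configuration would then be the validation accuracy of the CNN trained with it.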

References

  • A. Gülcü, Z. Kuş, “Konvolüsyonel sinir ağlarında hiper-parametre optimizasyonu yöntemlerinin incelenmesi,” Gazi University Journal of Science Part C: Design and Technology, 7(2), 503-522, 2019.
  • E. Öztemel, Yapay Sinir Ağları. Istanbul: Papatya Yayincilik, 2003.
  • J. Bergstra, R. Bardenet, Y. Bengio, B. Kégl, “Algorithms for hyper-parameter optimization,” in Proceedings of the 24th International Conference on Neural Information Processing Systems (NIPS 2011), 2546–2554, 2011.
  • M. Lindauer, K. Eggensperger, M. Feurer, A. Biedenkapp, D. Deng, C. Benjamins, T. Ruhkopf, R. Sass, and F. Hutter, “SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization,” The Journal of Machine Learning Research, 23(1), 2022.
  • L. Li, K. Jamieson, A. Rostamizadeh, and A. Talwalkar, “Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization,” Journal of Machine Learning Research, 18(1), 6765–6816, 2017.
  • O.M. Khanday, S. Dadvandipour, M.A. Lone, “Effect of filter sizes on image classification in CNN: A case study on CFIR10 and fashion-MNIST datasets,” IAES International Journal of Artificial Intelligence, 10 (4), 872–878, 2021.
  • H. Shao, E. Ma, M. Zhu, X. Deng, S. Zhai, “MNIST Handwritten Digit Classification Based on Convolutional Neural Network with Hyperparameter Optimization,” Intelligent Automation & Soft Computing, 36(3), 2023.
  • B. Bischl, M. Binder, M. Lang, T. Pielok, J. Richter, S. Coors, J. Thomas, T. Ullmann, M. Becker, A. Boulesteix, D. Deng, M. Lindauer, “Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges,” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 13, 2023.
  • T. Yigit, S. Atmaca, R. Gurfidan, R. Colak, “Hyper Parameter Analysis in Recognition of Handwritten Digits Using Convolutional Neural Network,” Gazi Journal of Engineering Sciences, 9(4), 268-277, 2017.
  • H. Orhan, D. S. Polat, H. Hakli, “Farkli Çaprazlama Teknikleri Kullanan Dağinik Arama Algoritmasi Ile Evrişimli Sinir Ağlarinda Hiper Parametre Optimizasyonu,” Kahramanmaraş Sütçü İmam Üniversitesi Mühendislik Bilimleri Dergisi, 27(4), 1437-1450, 2024.
  • X. Xiao, M. Yan, S. Basodi, C. Ji, Y. Pan, “Efficient hyperparameter optimization in deep learning using a variable length genetic algorithm,” arXiv preprint arXiv:2006.12703, 2020.
  • E. Tanyıldızı, F. Demirtaş, “Hiper Parametre Optimizasyonu (Hyper Parameter Optimization),” presented at the 2019 1st International Informatics and Software Engineering Conference (UBMYK), 2019.
  • E. Özbay, F. A. Özbay, “Parçacık Sürüsü Optimizasyon Algoritması ile Optimize Edilmiş Evrişimsel Sinir Ağı Kullanılarak Dermoskopik Görüntülerden Cilt Kanserinin Sınıflandırılması,” Fırat Üniversitesi Mühendislik Bilimleri Dergisi, 35(1), 261-273, 2023.
  • Ö. İnik, “CNN hyper-parameter optimization for environmental sound classification,” Applied Acoustics, 202, 109168, 2023.
  • M. E. Kıymaç, “Hyper-parameter optimization of deep neural networks with metaheuristic algorithms,” M.Sc. thesis, Alparslan Türkeş Bilim ve Teknoloji Üniversitesi, 2022.
  • M. Yurdakul, “Meta-sezgisel algoritmalar ile konvolüsyonel sinir ağı mimarisinin hiper parametrelerinin optimizasyonu,” M.Sc. thesis, Selçuk Üniversitesi, 2022.
  • G. Atteia, N. Abdel Samee, E. S. M. El-Kenawy, A. Ibrahim, “CNN-hyperparameter optimization for diabetic maculopathy diagnosis in optical coherence tomography and fundus retinography,” Mathematics, 10(18), 3274, 2022.
  • M. Ait Amou, K. Xia, S. Kamhi, M. Mouhafid, “A novel MRI diagnosis method for brain tumor classification based on CNN and Bayesian Optimization,” Healthcare, 10(3), 494, 2022.
  • H. ZainEldin, S. A. Gamel, E. S. M. El-Kenawy, A. H. Alharbi, D. S. Khafaga, A. Ibrahim, F. M. Talaat, “Brain tumor detection and classification using deep learning and sine-cosine fitness grey wolf optimization,” Bioengineering, 10(1), 18, 2022.
  • M. Sethi, S. Ahuja, S. Rani, P. Bawa, A. Zaguia, “Classification of Alzheimer’s Disease Using Gaussian‐Based Bayesian Parameter Optimization for Deep Convolutional LSTM Network,” Computational and Mathematical Methods in Medicine, 2021(1), 4186666, 2021.
  • M. Zhang, H. Li, S. Pan, J. Lyu, S. Ling, S. Su, “Convolutional neural networks-based lung nodule classification: A surrogate-assisted evolutionary algorithm for hyperparameter optimization,” IEEE Transactions on Evolutionary Computation, 25(5), 869-882, 2021.
  • K. Shankar, Y. Zhang, Y. Liu, L. Wu, C. H. Chen, “Hyperparameter tuning deep learning for diabetic retinopathy fundus image classification,” IEEE Access, 8, 118164-118173, 2020.
  • L. M. Seng, B. Bang Chen Chiang, Z. Arabee Abdul Salam, G. Yih Tan, H. Tong Chai, “MNIST handwritten digit recognition with different CNN architectures,” Journal of Applied Technology and Innovation, 5(1), 7-10, 2021.
  • C. Zhong, G. Li, Z. Meng, H. Li, A. R. Yildiz, S. Mirjalili, “Starfish optimization algorithm (SFOA): a bio-inspired metaheuristic algorithm for global optimization compared with 100 optimizers,” Neural Computing and Applications, 37(5), 3641-3683, 2025.
  • D. Izci, S. Ekinci, M. Jabari, M. Bajaj, V. Blazek, L. Prokop, ... & S. Mirjalili, “A new intelligent control strategy for CSTH temperature regulation based on the starfish optimization algorithm,” Scientific Reports, 15(1), 12327, 2025.
  • S. S. Kadam, A. C. Adamuthe, A. B. Patil, “CNN model for image classification on MNIST and fashion-MNIST dataset,” Journal of Scientific Research, 64(2), 374-384, 2020.
  • S. Ahlawat, A. Choudhary, A. Nayyar, S. Singh, & B. Yoon, “Improved handwritten digit recognition using convolutional neural networks (CNN),” Sensors, 20(12), 3344, 2020.
  • A. Jain & B. K. Sharma, “Analysis of activation functions for Convolutional Neural Network based MNIST handwritten character recognition,” International Journal of Advanced Studies of Scientific Research, 3(9), 2018.
  • A. A. Yahya, J. Tan, & M. Hu, “A novel handwritten digit classification system based on convolutional neural network approach,” Sensors, 21(18), 6273, 2021.
  • G. Gilanie, H. Shafiq, S. N. Batool, S. N. Abbas, H. Shafique, S. Cheema, ... & M. Ahsan, “Parameter optimization of autoencoder for image classification using genetic algorithm,” Spectrum of Engineering Sciences, 3(4), 201-213, 2025.
  • S. Kiliçarslan, “PSO+GWO: a hybrid particle swarm optimization and Grey Wolf optimization based Algorithm for fine-tuning hyper-parameters of convolutional neural networks for Cardiovascular Disease Detection,” Journal of Ambient Intelligence and Humanized Computing, 14(1), 87-97, 2023.
  • B.S. Park, S. Lee, Y.H. Seo, “Training Method for Enhancing Classification Accuracy of Kuzushiji-MNIST/49 using Deep Learning based on CNN,” Journal of the Korea Institute of Information and Communication Engineering, 24(3), 355-363, 2020.
  • S. Zhao, Q. Fan, Q. Dong, Z. Xing, X. Yang, X. He, “Efficient construction and convergence analysis of sparse convolutional neural networks,” Neurocomputing, 597, 128032, 2024.
  • B. T. Kettle and J. S. Lucas, “Biometric relationships between organ indices, fecundity, oxygen consumption and body size in Acanthaster planci (L.) (Echinodermata; Asteroidea),” Bulletin of Marine Science, 41(2), 541–551, 1987.
  • P. O. Ottesen and J. S. Lucas, “Divide or broadcast: interrelation of asexual and sexual reproduction in a population of the fissiparous hermaphroditic seastar Nepanthia belcheri (Asteroidea: Asterinidae),” Marine Biology, 69, 223–233, 1982.
  • M. Güler and A. Lök, “Foraging behaviors of sea stars, Marthasterias glacialis and Astropecten aranciacus (Asteroidea) and predator-prey interactions with warty venus clam, Venus verrucosa (Bivalvia),” Journal of Experimental Marine Biology and Ecology, 465, 99–106, 2015.
  • Y. Hayashi and T. Motokawa, “Effects of ionic environment on viscosity of catch connective tissue in holothurian body wall,” Journal of Experimental Biology, 125(1), 71–84, 1986.
  • Y. LeCun, B. E. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. E. Hubbard, L. D. Jackel, “Handwritten digit recognition with a back-propagation network,” Advances in neural information processing systems, 2, 1990.
  • Y. LeCun, L. Bottou, Y. Bengio, P. Haffner, “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, 86 (11), 2278-2324, 1998.
  • I. Goodfellow, Y. Bengio, A. Courville, Deep Learning (Vol. 1). Cambridge: MIT Press, 2016.
  • M. Tan, C. Emeksiz, “Hydrogen fuel cell parameter estimation using an innovative hybrid estimation model based on deep learning and probability pooling,” International Journal of Hydrogen Energy, 110, 445-456, 2024.
  • M. Braik, “Enhanced ali baba and the forty thieves algorithm for feature selection,” Neural Computing and Applications, 35(8), 6153–6184, 2023.
  • D. Albashish, A. I. Hammouri, M. Braik, J. Atwan, S. Sahran, “Binary biogeography-based optimization based SVM-RFE for feature selection,” Applied Soft Computing, 101, 107026–107045, 2021.
  • E. A. Mohamed, M. S. Braik, M. A. Al-Betar, M. A. Awadallah, “Boosted spider wasp optimizer for high-dimensional feature selection,” Journal of Bionic Engineering, 21(5), 2424-2459, 2024.
  • J. C. Geng, Z. Cui, X. S. Gu, “Scatter search based particle swarm optimization algorithm for earliness/tardiness flowshop scheduling with uncertainty,” International Journal of Automation and Computing, 13(3), 285-295, 2016.
  • R. B. Wang, R. B. Hu, F. D. Geng, L. Xu, S. C. Chu, J. S. Pan, ... & S. Mirjalili, “The Animated Oat Optimization Algorithm: A nature-inspired metaheuristic for engineering optimization and a case study on Wireless Sensor Networks,” Knowledge-Based Systems, 113589, 2025.
  • A. Kumar, I. Singh, B. Singh, “Classification of MNIST Dataset Using Different CNN Architecture,” in Intelligent Circuits and Systems for SDG 3 – Good Health and Well-being, pp. 273-280, CRC Press, 2024.
  • F. M. Talaat and S.A. Gamel, “RL based hyper-parameters optimization algorithm (ROA) for convolutional neural network,” Journal of Ambient Intelligence and Humanized Computing, 14(10), 13349-13359, 2023.
  • U. Erkan, A. Toktas, D. Ustun, “Hyperparameter optimization of deep CNN classifier for plant species identification using artificial bee colony algorithm,” Journal of Ambient Intelligence and Humanized Computing, 14(7), 8827-8838, 2023.
  • L. C. Ribeiro, G. H. D. Rosa, D. Rodrigues, J. P. Papa, “Convolutional neural networks ensembles through single-iteration optimization,” Soft Computing, 26(8), 3871-3882, 2022.
  • A. Ghosh, A. Mukherjee, C. Ghosh, “Simplistic deep learning for Japanese handwritten digit recognition,” in Intelligent Techniques and Applications in Science and Technology: Proceedings of the First International Conference on Innovations in Modern Science and Technology 1, pp. 87-93, Springer International Publishing, 2020.
  • H. Qiu and J. Dong, “A robust residual shrinkage balanced network for image recognition from Japanese historical documents,” Journal of Sensors, 2023(1), 8316638, 2023.
  • T. Erkoç, M. T. Eskıl, “A novel similarity based unsupervised technique for training convolutional filters,” IEEE Access, 11, 49393-49408, 2023.
  • C. A. da Silva, D. C. Rosa, P. B. Miranda, F. R. Cordeiro, T. Si, A. C. Nascimento, ... & P. S. de Mattos Neto, “A novel multi-objective grammar-based framework for the generation of convolutional neural networks,” Expert Systems With Applications, 212, 118670, 2023.
There are 54 citations in total.

Details

Primary Language English
Subjects Control Engineering, Mechatronics and Robotics (Other)
Journal Section Research Article
Authors

Gülnur Yıldızdan 0000-0001-6252-9012

Emine Baş 0000-0003-4322-6010

Publication Date December 1, 2025
Submission Date April 18, 2025
Acceptance Date July 25, 2025
Published in Issue Year 2025 Volume: 13 Issue: 4

Cite

IEEE G. Yıldızdan and E. Baş, “THE STUDY OF THE HYPERPARAMETER OPTIMIZATION ANALYSIS IN THE CONVOLUTIONAL NEURAL NETWORKS MODEL WITH STARFISH OPTIMIZATION ALGORITHM”, KONJES, vol. 13, no. 4, pp. 1137–1157, 2025, doi: 10.36306/konjes.1679144.