Research Article
Year 2024, Volume: 8 Issue: 2, 52 - 58


Comparison of Different Optimization Algorithms in the Fashion MNIST Dataset


Abstract

This study examines the effect of different optimization algorithms on deep learning models for classifying fashion clothing items. The Fashion MNIST dataset was chosen as the data source. Convolutional Neural Network (CNN) models were trained with the Nadam, Adadelta, Adamax, Adam, Adagrad, SGD, and RMSprop optimization algorithms, and the core of the study is understanding how these algorithms affect model performance during training. The findings show that the choice of optimization algorithm has a significant effect on the model's accuracy: Nadam and Adadelta achieved the highest accuracy rates, while RMSprop performed relatively worse. These results indicate that the choice of optimization technique can significantly influence the performance of deep learning-based classification systems.
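The article does not reproduce its training code, but the mechanical difference between the compared optimizers can be illustrated with their published update rules. Below is a minimal NumPy sketch (not the study's implementation; all names and hyperparameter values are illustrative defaults) contrasting plain SGD with Adam on a toy quadratic objective:

```python
import numpy as np

def grad(w):
    # Gradient of the toy objective f(w) = ||w||^2 / 2
    return w

def sgd(w, lr=0.1, steps=100):
    # Plain stochastic gradient descent: a fixed step against the gradient
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def adam(w, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    # Adam: per-parameter steps scaled by running moment estimates
    m = np.zeros_like(w)  # first-moment (mean) estimate
    v = np.zeros_like(w)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)  # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([3.0, -2.0])
print("SGD final norm: ", np.linalg.norm(sgd(w0)))
print("Adam final norm:", np.linalg.norm(adam(w0)))
```

The other optimizers compared in the study (Adagrad, Adadelta, RMSprop, Adamax, Nadam) differ mainly in how the moment estimates `m` and `v` are accumulated and applied; in a framework such as Keras they are selected by swapping the optimizer passed to `model.compile`.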

References

  • [1] Ö. Dolma, “COVID-19 and Non-COVID-19 Classification from Lung CT-Scan Images Using Deep Convolutional Neural Networks,” Int. J. Multidiscip. Stud. Innov. Technol., vol. 7, no. 2, p. 53, 2023, doi: 10.36287/ijmsit.7.2.3.
  • [2] E. Avuçlu, “Examining The Effect of Pre-processed Covid-19 Images On Classification Performance Using Deep Learning Method,” Int. Sci. Vocat. Stud. J., vol. 7, no. 2, pp. 94–102, Dec. 2023, doi: 10.47897/bilmes.1359954.
  • [3] E. Avuçlu, “Classification of Pistachio Images Using VGG16 and VGG19 Deep Learning Models,” Int. Sci. Vocat. Stud. J., vol. 7, no. 2, pp. 79–86, Dec. 2023, doi: 10.47897/bilmes.1328313.
  • [4] M. C. Bıngol and G. Bilgin, “Prediction of Chicken Diseases by Transfer Learning Method,” Int. Sci. Vocat. Stud. J., vol. 7, no. 2, pp. 170–175, Dec. 2023, doi: 10.47897/bilmes.1396890.
  • [5] Y. Durgun, “Classification of Starch Adulteration in Milk Using Spectroscopic Data and Machine Learning,” Int. J. Eng. Res. Dev., vol. 16, no. 1, pp. 221–226, 2024, doi: 10.29137/umagd.1379171.
  • [6] A. Williams, N. Walton, A. Maryanski, S. Bogetic, W. Hines, and V. Sobes, “Stochastic gradient descent for optimization for nuclear systems,” Sci. Rep., vol. 13, no. 1, p. 8474, May 2023, doi: 10.1038/s41598-023-32112-7.
  • [7] S. Nagendram et al., “Stochastic gradient descent optimisation for convolutional neural network for medical image segmentation,” Open Life Sci., vol. 18, no. 1, Aug. 2023, doi: 10.1515/biol-2022-0665.
  • [8] C. Song, A. Pons, and K. Yen, “AG-SGD: Angle-Based Stochastic Gradient Descent,” IEEE Access, vol. 9, pp. 23007–23024, 2021, doi: 10.1109/ACCESS.2021.3055993.
  • [9] C. Milovic et al., “Comparison of parameter optimization methods for quantitative susceptibility mapping,” Magn. Reson. Med., vol. 85, no. 1, pp. 480–494, Jan. 2021, doi: 10.1002/mrm.28435.
  • [10] M. Reyad, A. M. Sarhan, and M. Arafa, “A modified Adam algorithm for deep neural network optimization,” Neural Comput. Appl., vol. 35, no. 23, pp. 17095–17112, 2023, doi: 10.1007/s00521-023-08568-z.
  • [11] I. K. M. Jais, A. R. Ismail, and S. Q. Nisa, “Adam Optimization Algorithm for Wide and Deep Neural Network,” Knowl. Eng. Data Sci., vol. 2, no. 1, p. 41, 2019, doi: 10.17977/um018v2i12019p41-46.
  • [12] B. Cortiñas-Lorenzo and F. Pérez-González, “Adam and the Ants: On the Influence of the Optimization Algorithm on the Detectability of DNN Watermarks,” Entropy, vol. 22, no. 12, p. 1379, Dec. 2020, doi: 10.3390/e22121379.
  • [13] P. Ramachandran, T. Eswarlal, M. Lehman, and Z. Colbert, “Assessment of optimizers and their performance in autosegmenting lung tumors,” J. Med. Phys., vol. 48, no. 2, pp. 129–135, 2023, doi: 10.4103/jmp.jmp_54_23.
  • [14] P. Podder et al., “LDDNet: A Deep Learning Framework for the Diagnosis of Infectious Lung Diseases,” Sensors, vol. 23, no. 1, 2023, doi: 10.3390/s23010480.
  • [15] C. Annamalai, C. Vijayakumaran, V. Ponnusamy, and H. Kim, “Optimal ElGamal Encryption with Hybrid Deep-Learning-Based Classification on Secure Internet of Things Environment,” Sensors, vol. 23, no. 12, p. 5596, Jun. 2023, doi: 10.3390/s23125596.
  • [16] R. Elshamy, O. Abu-Elnasr, M. Elhoseny, and S. Elmougy, “Improving the efficiency of RMSProp optimizer by utilizing Nestrove in deep learning,” Sci. Rep., vol. 13, no. 1, p. 8814, May 2023, doi: 10.1038/s41598-023-35663-x.
  • [17] X. Jiang, B. Hu, S. Chandra Satapathy, S. H. Wang, and Y. D. Zhang, “Fingerspelling Identification for Chinese Sign Language via AlexNet-Based Transfer Learning and Adam Optimizer,” Sci. Program., vol. 2020, 2020, doi: 10.1155/2020/3291426.
  • [18] A. Daneshvar, M. Ebrahimi, F. Salahi, M. Rahmaty, and M. Homayounfar, “Brent Crude Oil Price Forecast Utilizing Deep Neural Network Architectures,” Comput. Intell. Neurosci., vol. 2022, pp. 1–13, May 2022, doi: 10.1155/2022/6140796.
  • [19] V. Ojha and G. Nicosia, “Backpropagation Neural Tree,” Neural Networks, vol. 149, pp. 66–83, May 2022, doi: 10.1016/j.neunet.2022.02.003.
  • [20] B. Zhu, Y. Shi, J. Hao, and G. Fu, “Prediction of Coal Mine Pressure Hazard Based on Logistic Regression and Adagrad Algorithm—A Case Study of C Coal Mine,” Appl. Sci., vol. 13, no. 22, 2023, doi: 10.3390/app132212227.
  • [21] F. Aamir, I. Aslam, M. Arshad, and H. Omer, “Accelerated Diffusion-Weighted MR Image Reconstruction Using Deep Neural Networks,” J. Digit. Imaging, vol. 36, no. 1, pp. 276–288, Nov. 2022, doi: 10.1007/s10278-022-00709-5.
  • [22] G. Ayana, J. Park, J.-W. Jeong, and S. Choe, “A Novel Multistage Transfer Learning for Ultrasound Breast Cancer Image Classification,” Diagnostics, vol. 12, no. 1, p. 135, Jan. 2022, doi: 10.3390/diagnostics12010135.
  • [23] R. Sirisha, N. Anjum, and K. Vaidehi, “Implementation of CNN and ANN for Fashion-MNIST-Dataset using Different Optimizers,” Indian J. Sci. Technol., vol. 15, no. 47, pp. 2639–2645, 2022, [Online]. Available: https://www.indjst.org/
  • [24] A. S. Henrique et al., “Classifying Garments from Fashion-MNIST Dataset Through CNNs,” Adv. Sci. Technol. Eng. Syst. J., vol. 6, no. 1, pp. 989–994, 2021, doi: 10.25046/aj0601109.
  • [25] O. M. Khanday, S. Dadvandipour, and M. A. Lone, “Effect of filter sizes on image classification in CNN: A case study on CFIR10 and fashion-MNIST datasets,” IAES Int. J. Artif. Intell., vol. 10, no. 4, pp. 872–878, 2021, doi: 10.11591/ijai.v10.i4.pp872-878.
  • [26] Y. Tang, H. Cui, and S. Liu, “Optimal Design of Deep Residual Network Based on Image Classification of Fashion-MNIST Dataset,” J. Phys. Conf. Ser., vol. 1624, no. 5, pp. 0–7, 2020, doi: 10.1088/1742-6596/1624/5/052011.
  • [27] M. Kayed, A. Anter, and H. Mohamed, “Classification of Garments from Fashion MNIST Dataset Using CNN LeNet-5 Architecture,” Proc. 2020 Int. Conf. Innov. Trends Commun. Comput. Eng. ITCE 2020, no. June, pp. 238–243, 2020, doi: 10.1109/ITCE48509.2020.9047776.
  • [28] H. H. Zhu et al., “Space-efficient optical computing with an integrated chip diffractive neural network,” Nat. Commun., vol. 13, no. 1, pp. 1–9, 2022, doi: 10.1038/s41467-022-28702-0.
  • [29] T. Hur, L. Kim, and D. K. Park, “Quantum convolutional neural network for classical data classification,” Quantum Mach. Intell., vol. 4, no. 1, pp. 1–18, 2022, doi: 10.1007/s42484-021-00061-x.
  • [30] O. Nocentini, J. Kim, M. Z. Bashir, and F. Cavallo, “Image Classification Using Multiple Convolutional Neural Networks on the Fashion-MNIST Dataset,” Sensors, vol. 22, no. 23, p. 9544, Dec. 2022, doi: 10.3390/s22239544.
  • [31] S. Yang, S. Hoque, and F. Deravi, “Adaptive Template Reconstruction for Effective Pattern Classification,” Sensors, vol. 23, no. 15, p. 6707, Jul. 2023, doi: 10.3390/s23156707.
  • [32] S. Coleman, D. Kerr, and Y. Zhang, “Image Sensing and Processing with Convolutional Neural Networks,” Sensors, vol. 22, no. 10, p. 3612, May 2022, doi: 10.3390/s22103612.
  • [33] V. Terziyan, D. Malyk, M. Golovianko, and V. Branytskyi, “Hyper-flexible Convolutional Neural Networks based on Generalized Lehmer and Power Means,” Neural Networks, vol. 155, pp. 177–203, Nov. 2022, doi: 10.1016/j.neunet.2022.08.017.
  • [34] K. Wang, C. Xu, G. Li, Y. Zhang, Y. Zheng, and C. Sun, “Combining convolutional neural networks and self-attention for fundus diseases identification,” Sci. Rep., vol. 13, no. 1, p. 76, Jan. 2023, doi: 10.1038/s41598-022-27358-6.
  • [35] E. Chu, D. Li, and Y. Tong, “Optimized federated learning based on Adagrad algorithm and algorithm optimization,” Appl. Comput. Eng., vol. 19, no. 1, pp. 9–17, Oct. 2023, doi: 10.54254/2755-2721/19/20231000.
  • [36] I. Naseer, S. Akram, T. Masood, A. Jaffar, M. A. Khan, and A. Mosavi, “Performance Analysis of State-of-the-Art CNN Architectures for LUNA16,” Sensors, vol. 22, no. 12, p. 4426, Jun. 2022, doi: 10.3390/s22124426.
  • [37] Y. S. Saboo, S. Kapse, and P. Prasanna, “Convolutional Neural Networks (CNNs) for Pneumonia Classification on Pediatric Chest Radiographs,” Cureus, Aug. 2023, doi: 10.7759/cureus.44130.
  • [38] M. Uppal et al., “Enhancing accuracy in brain stroke detection: Multi-layer perceptron with Adadelta, RMSProp and AdaMax optimizers,” Front. Bioeng. Biotechnol., vol. 11, Sep. 2023, doi: 10.3389/fbioe.2023.1257591.
  • [39] R. Liang, X. Chang, P. Jia, and C. Xu, “Mine Gas Concentration Forecasting Model Based on an Optimized BiGRU Network,” ACS Omega, vol. 5, no. 44, pp. 28579–28586, Nov. 2020, doi: 10.1021/acsomega.0c03417.
  • [40] S. B. ud din Tahir, A. Jalal, and K. Kim, “Wearable Inertial Sensors for Daily Activity Analysis Based on Adam Optimization and the Maximum Entropy Markov Model,” Entropy, vol. 22, no. 5, p. 579, May 2020, doi: 10.3390/e22050579.

Details

Primary Language English
Subjects Image Processing
Journal Section Articles
Authors

Umut Saray 0000-0003-3339-6876

Uğur Çavdar 0000-0002-3434-6670

Early Pub Date December 8, 2024
Publication Date
Submission Date July 3, 2024
Acceptance Date August 30, 2024
Published in Issue Year 2024 Volume: 8 Issue: 2

Cite

IEEE U. Saray and U. Çavdar, “Comparison of Different Optimization Algorithms in the Fashion MNIST Dataset”, IJMSIT, vol. 8, no. 2, pp. 52–58, 2024.