Effect of regularization methods on model performance
Abstract
Keywords
References
- [1] H. Akaike, "Information theory and an extension of the maximum likelihood principle," in Selected Papers of Hirotugu Akaike. Berlin, Germany: Springer, 1998, pp. 199–213.
- [2] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," in Advances in Neural Information Processing Systems, 2012, pp. 1097–1105.
- [3] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: A simple way to prevent neural networks from overfitting," Journal of Machine Learning Research, vol. 15, no. 1, pp. 1929–1958, 2014.
- [4] S. Ioffe and C. Szegedy, "Batch normalization: Accelerating deep network training by reducing internal covariate shift," in Proceedings of the 32nd International Conference on Machine Learning (ICML), 2015, pp. 448–456.
- [5] Y. Wu and K. He, "Group normalization," in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 3–19.
- [6] J. L. Ba, J. R. Kiros, and G. E. Hinton, "Layer normalization," arXiv preprint arXiv:1607.06450, 2016.
- [7] D. Ulyanov, A. Vedaldi, and V. Lempitsky, "Instance normalization: The missing ingredient for fast stylization," arXiv preprint arXiv:1607.08022, 2016.
- [8] L. Wan, M. Zeiler, S. Zhang, Y. LeCun, and R. Fergus, "Regularization of neural networks using DropConnect," in Proceedings of the 30th International Conference on Machine Learning (ICML), 2013, pp. 1058–1066.
Details
Primary Language
English
Subjects
-
Journal Section
Research Article
Authors
Cafer Budak
*
0000-0002-8470-4579
Türkiye
Vasfiye Mençik
0000-0002-3769-0071
Türkiye
Mehmet Emin Asker
0000-0003-4585-4168
Türkiye
Publication Date
December 31, 2021
Submission Date
November 16, 2021
Acceptance Date
-
Published in Issue
Year 2021, Volume 12, Issue 5