Research Article

The Effect of the Normalization Method Used in Different Sample Sizes on the Success of Artificial Neural Network Model

Year 2019, Volume 6, Issue 2, 170-192, 15.07.2019
https://doi.org/10.21449/ijate.479404

Abstract

This study aimed to compare different normalization methods employed in the model development process with artificial neural networks across different sample sizes. For the comparison, the input variables were work discipline, environmental awareness, instrumental motivation, science self-efficacy, and weekly science learning time, as covered in PISA 2015, while students' science literacy level was defined as the output variable. The amount of explained variance and statistics on correct classification rates were used to compare the normalization methods considered in the study. The dataset was analyzed in Matlab 2017b, and both prediction and classification algorithms were used. According to the findings, the adjusted min-max normalization method yielded better results than the other normalization methods in terms of explained variance across different sample sizes; no significant difference in correct classification rates was found across normalization methods for the non-normally distributed data; and the possibility of overfitting should be taken into consideration when working with small samples in the artificial neural network modelling process. In addition, sample size had a significant effect on both classification and prediction analyses performed with artificial neural network methods. The study concluded that, with a sample size over 1000, more consistent results can be obtained in studies performed with artificial neural networks in the field of education.
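The normalization methods compared in studies of this kind can be sketched as follows. The article's analyses were run in Matlab; this Python sketch is illustrative only, and the exact target range [0.1, 0.9] used here for "adjusted" min-max is an assumption, not a detail taken from the article.

```python
# Illustrative sketch of common normalization methods (not the authors' Matlab code).
# The [0.1, 0.9] target range for "adjusted min-max" is an assumed convention.

def min_max(xs):
    """Rescale values linearly into [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def adjusted_min_max(xs, a=0.1, b=0.9):
    """Rescale into [a, b], keeping inputs away from activation-function saturation."""
    return [a + (b - a) * x for x in min_max(xs)]

def z_score(xs):
    """Center on the mean and scale by the (population) standard deviation."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / sd for x in xs]

scores = [320.0, 410.0, 455.0, 500.0, 610.0]  # hypothetical literacy scores
print(min_max(scores))           # endpoints map to 0.0 and 1.0
print(adjusted_min_max(scores))  # endpoints map to 0.1 and 0.9
```

Keeping normalized inputs inside an interval such as [0.1, 0.9] rather than [0, 1] is a common motivation for adjusted min-max scaling, since sigmoid-type activations saturate near their extremes.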

References

  • Aksu, G., & Doğan, N. (2018). Veri Madenciliğinde Kullanılan Öğrenme Yöntemlerinin Farklı Koşullar Altında Karşılaştırılması [Comparison of Learning Methods Used in Data Mining under Different Conditions], Ankara Üniversitesi Eğitim Bilimleri Fakültesi Dergisi, 51(3), 71-100.
  • Ali, A. & Senan, N. (2017). The Effect of Normalization in VIOLENCE Video Classification Performance. IOP Conf. Ser.: Mater. Sci. Eng. 226 012082.
  • Anderson, J. A. (1990). Data Representation in Neural Networks, AI Expert.
  • Jayalakshmi, T., & Santhakumaran, A. (2011). Statistical Normalization and Back Propagation for Classification. International Journal of Computer Theory and Engineering, 3(1), 1793-8201.
  • Azadeh, M., Sheikhalishahi, M., Tabesh, A., & Negahban (2011). The Effects of Pre-Processing Methods on Forecasting Improvement of Artificial Neural Networks, Australian Journal of Basic and Applied Sciences, 5(6), 570-580.
  • Azimi-Sadjadi, M.R. & Stricker, S.A. (1994). Detection and Classification of Buried Dielectric Anomalies Using Neural Networks: Further Results. IEEE Trans. Instrumentation and Measurement, 43, pp. 34-39.
  • Bishop, C. M. (1995), Neural Networks for Pattern Recognition, Oxford: Oxford University Press.
  • Cihan, P., Kalıpsız, O., & Gökçe, E. (2017). Hayvan Hastalığını Teşhisinde Normalizasyon Tekniklerinin Yapay Sinir Ağı Performansına Etkisi [Effect of Normalization Techniques on Artificial Neural Network and Feature Selection Performance in Animal Disease Diagnosis]. e-Turkish Studies (elektronik), 12(11), 59-70, 2017.
  • Davydov, M.V., Osipov, A.N., Kilin, S.Y. & Kulchitsky, V.A. (2018). Neural Network Structures: Current and Future States. Open semantic technologies for intelligent systems, 259-264.
  • Dekking, F.M., Kraaikamp, C., Lopuhaä, H.P., & Meester, L.E. (2005). A modern introduction to probability and statistics: Understanding why and how. United States: Springer-Verlag London Limited.
  • Deveci, M. (2012). Yapay Sinir Ağları ve Bekleme Süresinin Tahmininde Kullanılması [Artificial Neural Networks and Used of Waiting Time Estimation]. Unpublished Master Dissertation, Gazi Üniversitesi Sosyal Bilimleri Enstitüsü, Ankara.
  • Elmas, Ç. (2003). Yapay Sinir Ağları [Artificial Neural Networks], 1st ed., Ankara: Seçkin Yayıncılık.
  • Famili, A., Shen, W., Weber, R., & Simoudis, E. (1997). Data Preprocessing and Intelligent Data Analysis. Intelligent Data Analysis, 1, 3-23.
  • Finch, J. F., West, S. G., & MacKinnon, D. P. (1997). Effects of sample size and nonnormality on the estimation of mediated effects in latent variable models. Structural Equation Modeling: A Multidisciplinary Journal, 4(2), 87-107.
  • Fraenkel, J.R., & Wallen, N.E. (2006). How to design and evaluate research in education (6th ed.). New York, NY: McGraw-Hill.
  • Gardner, M. W., & Dorling, S. R. (1998). Artificial Neural Networks (The Multilayer Perceptron) - A Review of Applications in the Atmospheric Sciences. Atmospheric Environment, 32, 2627-2636.
  • Gerasimovic, M., Stanojevic, L., Bugaric, U., Miljkovic, Z., & Veljovic, A. (2011). Using Artificial Neural Networks for Predictive Modeling of Graduates’ Professional Choice. The New Educational Review, 23, 175- 188.
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.
  • Gonzalez, J.M., & DesJardins, S.L. (2002). Artificial neural networks: A new approach to predicting application behaviour, Research in Higher Education, 43(2), 235–258
  • Gschwind, M. (2007). Predicting Late Payments: A Study in Tenant Behavior Using Data Mining Techniques. The Journal of Real Estate Portfolio Management, 13(3), 269-288.
  • Hagan, M.T., Demuth, H.B., Beale, M.H., & Jesus, O. (2014). Neural Network Design, Boston: PWS Publishing Co.
  • Hastie, T., Tibshirani, R., & Friedman, J. H. (2009). The elements of statistical learning: Data mining, inference, and prediction. New York, NY: Springer.
  • Hayashi, Y., Hsieh, M-H., & Setiono, R. (2009). Predicting Consumer Preference for Fast-Food Franchises: A Data Mining Approach. The Journal of the Operational Research Society, 60(9), 1221-1229.
  • Haykin, S. (1999). Neural Networks: A Comprehensive Foundation. 2nd Edition, Prentice- Hall, Englewood Cliffs, NJ.
  • Holmstrom, L., & Koistinen, P. (1992). Using additive noise in back-propagation training. IEEE Trans. Neural Networks, 3, 24–38
  • Hua, J.P., Lowey, J., Xiong, Z., & Dougherty, E.R. (2006). Noise-injected neural networks show promise for use on small-sample expression data. BMC Bioinformatics, 7 (Art. no. 274).
  • Hu, X. (2003). DB-H Reduction: A Data Preprocessing Algorithm for Data Mining Applications. Applied Math. Letters, 16, 889- 895.
  • Hunt, K.J., Sbarbaro, D., Bikowski, R., & Gawthrop, P.J. (1992). Neural Networks for Control Systems - A Survey. Automatica, 28, pp. 1083-1112.
  • Karasar, N. (2009). Bilimsel Araştırma Yöntemi [Scientific Research Method]. Ankara: Nobel Yayıncılık.
  • Klein, B.D., & Rossin, D.F. (1999). Data Quality in Neural Network Models: Effect of Error Rate and Magnitude of Error on Predictive Accuracy. OMEGA, The Int. J. Management Science, 27, pp. 569-582.
  • Kriesel, D. (2007). A Brief Introduction to Neural Networks. Available at http://www.dkriesel.com/_media/science/neuronalenetze-en-zeta2-2col-dkrieselcom.pdf
  • Krycha, K. A., & Wagner, U. (1999). Applications of Artificial Neural Networks in Management Science: A Survey. J. Retailing and Consumer Services, 6, pp. 185-203,
  • Lawrance, J. (1991). Data Preparation for a Neural Network, AI Expert. 6 (11), 34-41.
  • Lou, M. (1993). Preprocessing Data for Neural Networks. Technical Analysis of Stocks & Commodities Magazine, Oct.
  • Mannila, H. (1996). Data mining: machine learning, statistics, and databases, Proceedings of 8th International Conference on Scientific and Statistical Data Base Management, Stockholm, Sweden, June 18–20, 1996.
  • Matlab (2002). Matlab, Version 6.5. Natick, MA: The Mathworks Inc.
  • Mustaffa, Z., & Yusof, Y. (2011). A Comparison of Normalization Techniques in Predicting Dengue Outbreak. International Conference on Business and Economics Research, Vol.1 IACSIT Press, Kuala Lumpur, Malaysia
  • Namin, A. H., Leboeuf, K., Wu, H., & Ahmadi, M. (2009). Artificial Neural Networks Activation Function HDL Coder, Proceedings of IEEE International Conference on Electro/Information Technology, Ontario, Canada, 7-9 June, 2009.
  • Narendra, K. S., & Parthasarathy, K. (1990). Identification and Control of Dynamic Systems Using Neural Networks. IEEE Trans. Neural Networks, 1, pp. 4-27.
  • Nawi, N. M., Atomi, W. H., Rehman, M. Z. (2013). The Effect of Data Pre-Processing on Optimized Training of Artificial Neural Networks. Procedia Technology, 11, 32-39.
  • Neelamegam, S., & Ramaraj, E. (2013). Classification algorithm in Data mining: An Overview. International Journal of P2P Network Trends and Technology (IJPTT), 4(8), 369-374.
  • OECD, (2015). Frascati Manual 2015: Guidelines for Collecting and Reporting Data on Research and Experimental Development, The Measurement of Scientific and Technical Activities, OECD Publishing, Paris.
  • O’Shea, K., & Nash, R. (2015). An Introduction to Convolutional Neural Networks, arXiv:1511.08458 [cs.NE], November.
  • Özkan, A.O. (2017). Effect of Normalization Techniques on Multilayer Perceptron Neural Network Classification Performance for Rheumatoid Arthritis Disease Diagnosis. International Journal of Trend Scientific Research and Development. Volume 1, Issue 6.
  • Öztemel, E. (2003), Yapay Sinir Ağları [Artificial Neural Networks], İstanbul: Papatya Yayıncılık.
  • Rafiq, M.Y., Bugmann, G., & Easterbrook, D.J. (2001). Neural Network Design for Engineering Applications. Computers & Structures, 79, pp. 1541-1552.
  • Ravid, R. (2011). Practical statistics for educators (fourth edition). United States: Rowman & Littlefield Publishers.
  • Redman, T. C. (1992). Data Quality: Management and Technology. New York: Bantam Books.
  • Ripley, B.D. (1996), Pattern Recognition and Neural Networks, Cambridge: Cambridge University Press.
  • Romero, C., & Ventura, S. (2011). Educational data mining: a review of the state-of-the-art. IEEE Trans. Syst. Man Cybernet. C Appl. Rev., 40(6), 601–618.
  • Roussas, G. (2007). Introduction to probability (first edition). United States: Elsevier Academic Press.
  • Rumelhart, D.E. (1994). The Basic Ideas in Neural Networks. Comm. ACM, 37, pp. 87-92.
  • Panigrahi, S., & Behera, H. S. (2013). Effect of Normalization Techniques on Univariate Time Series Forecasting using Evolutionary Higher Order Neural Network. International Journal of Engineering and Advanced Technology, 3(2), 280-285.
  • Sattler, K.U., & Schallehn, E. (2001). A Data Preparation Framework Based on a Multidatabase Language. Proc. Int’l Symp. Database Eng. & Applications, pp. 219-228.
  • Schmidhuber, J. (2015). Deep Learning in Neural Networks: An Overview. Neural Networks, 61, 85-117.
  • Schumacher, P., Olinsky, A., Quinn, J., & Smith, R. (2010). A Comparison of Logistic Regression, Neural Networks, and Classification Trees Predicting Success of Actuarial Students. Journal of Education for Business, 85(5), 258-263.
  • Silva, C.S. and Fonseca, J.M. (2017). Educational Data Mining: a literature review. Advances in Intelligent Systems and Computing, 2-9.
  • Stein, R. (1993). Selecting data for neural networks, AI Expert.
  • Suma, V. R., Renjith, S., Ashok, S., & Judy, M. V. (2016). Analytical Study of Selected Classification Algorithms for Clinical Dataset. Indian Journal of Science and Technology, 9(11), 1-9, DOI: 10.17485/ijst/2016/v9i11/67151.
  • Upadhyay, N. (2016). Educational Data Mining by Using Neural Network. International Journal of Computer Applications Technology and Research, 5(2), 104-109.
  • Uslu, M. (2013). Yapay Sinir Ağları ile Sınıflandırma[Classification with Artificial Neural Networks], İleri İstatistik Projeleri I [Advanced Statistics Projects I]. Hacettepe Üniversitesi Fen Fakültesi İstatistik Bölümü, Ankara.
  • Vijayabhanu, R. & Radha, V. (2013). Dynamic Score Normalization Technique using Mahalonobis Distance to Predict the Level of COD for an Anaerobic Wastewater Treatment System. The International Journal of Computer Science & Applications. 2(3), May 2013, ISSN – 2278-1080.
  • Yavuz, S., & Deveci, M. (2012). İstatiksel Normalizasyon Tekniklerinin Yapay Sinir Ağın Performansına Etkisi. [The Effect of Statistical Normalization Techniques on The Performance of Artificial Neural Network], Erciyes University Journal of Faculty of Economics and Administrative Sciences, 40, 167-187.
  • Yu, L., Wang, S., & Lai, K.K. (2006). An integrated data preparation scheme for neural network data analysis. IEEE Trans. Knowl. Data Eng., 18, 217–230.
  • Wang, F., Devabhaktuni, V.K., Xi, C., & Zhang, Q. (1998). Neural Network Structures and Training Algorithms for RF and Microwave Applications. John Wiley & Sons, Inc. Int J RF and Microwave CAE, 9, 216-240.
  • Wook, M., Yahaya, Y. H., Wahab, N., Isa, M. R. M., Awang, N. F., Seong, H. Y. (2009). Predicting NDUM Student's Academic Performance Using Data Mining Techniques, The Second International Conference on Computer and Electrical Engineering, Dubai, United Arab Emirates, 28-30 December, 2009.
  • Zhang, S., Zhang, C., & Yang, Q. (2003). Data Preparation for Data Mining. Applied Artificial Intelligence, 17, 375-381.
  • Zur, R.M., Jiang, Y.L., Pesce, L.L., & Drukker, K. (2009). Noise injection for training artificial neural networks: a comparison with weight decay and early stopping. Med. Phys., 36(10), 4810–4818.

There are 68 citations in total.

Details

Primary Language English
Subjects Studies on Education
Journal Section Articles
Authors

Gökhan Aksu 0000-0003-2563-6112

Cem Oktay Güzeller 0000-0002-2700-3565

Mehmet Taha Eser 0000-0001-7031-1953

Publication Date July 15, 2019
Submission Date November 7, 2018
Published in Issue Year 2019

Cite

APA Aksu, G., Güzeller, C. O., & Eser, M. T. (2019). The Effect of the Normalization Method Used in Different Sample Sizes on the Success of Artificial Neural Network Model. International Journal of Assessment Tools in Education, 6(2), 170-192. https://doi.org/10.21449/ijate.479404

Cited By

  • A data‐driven assessment of mobile operator service quality in Ghana. The Electronic Journal of Information Systems in Developing Countries. https://doi.org/10.1002/isd2.12312
  • Gayrimenkul Değerlemesinde Yapay Sinir Ağlarının Kullanılması [Use of Artificial Neural Networks in Real Estate Valuation]. International Journal of Advanced Natural Sciences and Engineering Researches. https://doi.org/10.59287/ijanser.721
  • Poliyapram Vinayaraj. Transfer Learning With CNNs for Segmentation of PALSAR-2 Power Decomposition Components. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. https://doi.org/10.1109/JSTARS.2020.3031020