Research Article

Computational Thinking in Early Childhood: Psychometric Properties of the BD-HAM (Computational thinking: Debugging, Algorithmic Thinking, Modularity) Test with Rasch Model

Year 2026, Issue: Advanced Online Publication, 276 - 296
https://doi.org/10.17275/per.26.15.13.1

Abstract

This study provides evidence for the validity and reliability of the Computational Thinking: Debugging, Algorithmic Thinking, Modularity Test (BD-HAM) using the Rasch model, within the framework of Item Response Theory, with data collected from 195 children aged 48-72 months. First, a tetrachoric factor analysis with Robust Diagonally Weighted Least Squares estimation was conducted, supporting the unidimensional structure of the instrument. The item loadings and reliability indices obtained after the factor analysis were appropriate for dichotomously (0-1) scored data, and analyses of test quality and efficiency yielded satisfactory results. The findings indicated adequate internal consistency. Rasch calibration showed satisfactory fit at the model, item, and person levels. The reliability of the Rasch model was examined through the test information function curve and the marginal reliability function, which showed that the test is most reliable around the average ability level. A differential item functioning (DIF) analysis indicated that the test functions equivalently across genders, and a further comparison revealed no significant difference between boys' and girls' scores. Overall, the findings indicate that the 16-item BD-HAM can serve as a valid and reliable tool for assessing the computational thinking skills of children aged 48-72 months. A brief reproducibility sketch of this kind of calibration follows.
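For readers who want to see what such a calibration involves, the dichotomous Rasch model expresses the probability of a correct response as P(X = 1 | θ, b) = exp(θ − b) / (1 + exp(θ − b)), where θ is the child's ability and b the item difficulty. The sketch below shows how an analysis of this kind could be run in R with the eRm package. The abstract does not name the software the authors used, so this is an illustration rather than their exact procedure; the object names `responses` (a 195 × 16 matrix of 0/1 item scores) and `gender` (a binary grouping vector) are hypothetical.

    # Minimal sketch of a dichotomous Rasch calibration with eRm.
    # `responses` and `gender` are hypothetical placeholders for the study data.
    library(eRm)

    fit <- RM(responses)              # dichotomous Rasch model (CML estimation)
    pp  <- person.parameter(fit)      # person ability (theta) estimates

    itemfit(pp)                       # infit/outfit mean-squares for each item
    personfit(pp)                     # person-level fit statistics
    SepRel(pp)                        # person separation reliability

    LRtest(fit, splitcr = gender)     # Andersen LR test: DIF check by gender
    plotPImap(fit)                    # Wright map of item difficulties vs. persons

Item- and person-level fit are typically judged against the usual infit/outfit mean-square guidelines, and a non-significant Andersen likelihood-ratio test across the gender split is consistent with the absence of DIF reported above.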

Ethical Statement

This study was reviewed and approved by the Hacettepe University Social Sciences and Humanities Ethics Committee (protocol number E-77278857-900-00003803900). Through informed consent, parents and children were told about the purpose, procedures, and potential risks and benefits of the study, and that they could withdraw from the study at any time. To protect confidentiality, codes were used in place of children's and schools' names.

Supporting Institution

This study is part of the corresponding author's PhD dissertation. During the preparation of the article, the corresponding author received a 100/2000 priority-area doctoral scholarship from the Council of Higher Education of Türkiye (YÖK).


Details

Primary Language English
Subjects Specialist Studies in Education (Other)
Journal Section Research Article
Authors

Hazel Sıla Menteş Tanaydın 0000-0003-4524-4335

Berrin Akman 0000-0001-5668-4382

Submission Date August 5, 2025
Acceptance Date October 16, 2025
Early Pub Date December 9, 2025
Published in Issue Year 2026 Issue: Advanced Online Publication

Cite

APA Menteş Tanaydın, H. S., & Akman, B. (2025). Computational Thinking in Early Childhood: Psychometric Properties of the BD-HAM (Computational thinking: Debugging, Algorithmic Thinking, Modularity) Test with Rasch Model. Participatory Educational Research, (Advanced Online Publication), 276-296. https://doi.org/10.17275/per.26.15.13.1