Item block position and format effects in eTIMSS among low- and high-achieving countries
Abstract
The Trends in International Mathematics and Science Study (TIMSS) was administered via computer (eTIMSS) for the first time in 2019. The purpose of this study was to investigate the effects of item block position and item format on eighth-grade mathematics item easiness in low- and high-achieving countries in eTIMSS 2019. Item responses from Chile, Qatar, and Malaysia (low-achieving countries) and from the Republic of Korea, Chinese Taipei, and Singapore (high-achieving countries) were used. Block position and item format effects were investigated within an explanatory item response theory framework. The results revealed a negative and statistically significant item block position effect in all low- and high-achieving countries, and the effect was more prominent in the low-achieving countries: as item block position increased, students’ probability of giving a correct response to an item decreased. Additionally, all high- and low-achieving countries showed a negative and significant item format effect, in that multiple-choice items appeared easier than constructed-response items.
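The explanatory IRT framework described above can be sketched as a small simulation. This is not the authors' analysis: the test design (eight items, four block positions, two formats) and the effect sizes (-0.15 logits per position step, -0.6 logits for constructed-response items) are assumed for illustration only. The sketch shows how a negative position effect and a format effect on the log-odds scale surface as declining proportions correct.

```python
import numpy as np

# Illustrative sketch, not the study's code: dichotomous responses are
# simulated under an explanatory IRT model in which the log-odds of a
# correct answer combine person ability, a block-position effect, and an
# item-format effect. All parameter values below are assumptions.
rng = np.random.default_rng(42)

n_persons = 5000
position = np.array([1, 1, 2, 2, 3, 3, 4, 4])  # block position of each item
fmt = np.array([0, 1, 0, 1, 0, 1, 0, 1])       # 0 = multiple choice, 1 = constructed response

theta = rng.normal(0.0, 1.0, size=(n_persons, 1))  # person ability
beta_pos, beta_fmt = -0.15, -0.6                   # assumed effect sizes

# Item easiness is held at 0 so the position and format effects are visible
# directly in the observed proportions correct.
logit = theta + beta_pos * (position - 1) + beta_fmt * fmt
prob = 1.0 / (1.0 + np.exp(-logit))
y = (rng.random(prob.shape) < prob).astype(int)

# Proportion correct by block position: declines as position increases.
p_by_pos = [y[:, position == k].mean() for k in (1, 2, 3, 4)]
# Proportion correct by format: multiple-choice items look easier.
mc_rate = y[:, fmt == 0].mean()
cr_rate = y[:, fmt == 1].mean()
print([round(p, 3) for p in p_by_pos], round(mc_rate, 3), round(cr_rate, 3))
```

In the actual study such effects would be estimated, not assumed, for example with a generalized linear mixed model that regresses item responses on position and format with random person effects.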
Keywords
Ethical Statement
The data are open access, so an Ethics Committee Approval Letter is not required.
Details
Primary Language
English
Subjects
Measurement Theories and Applications in Education and Psychology
Journal Section
Research Article
Authors
Early Pub Date
July 21, 2025
Publication Date
September 4, 2025
Submission Date
August 14, 2024
Acceptance Date
June 4, 2025
Published in Issue
Year 2025, Volume 12, Number 3