Research Article

Item block position and format effects in e-TIMSS among the low- and high-achieving countries

Year 2025, Volume: 12 Issue: 3, 781 - 786, 04.09.2025
https://doi.org/10.21449/ijate.1532862

Abstract

The Trends in International Mathematics and Science Study (TIMSS) was administered via computer, as eTIMSS, for the first time in 2019. The purpose of this study was to investigate the effects of item block position and item format on eighth-grade mathematics item easiness in the low- and high-achieving countries of eTIMSS 2019. Item responses from Chile, Qatar, and Malaysia, which were low-achieving countries, as well as the Republic of Korea, Chinese Taipei, and Singapore, which were high-achieving countries, were used in the study. The block position and item format effects were investigated within the explanatory item response theory framework. The results revealed a negative and statistically significant item block position effect in all low- and high-achieving countries, and this effect was more prominent in the low-achieving countries: as the position of an item block increased, students' probability of giving a correct response to its items decreased. Additionally, the results showed that all high- and low-achieving countries had a negative and significant item format effect, in that multiple-choice items appeared easier than constructed-response items.
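
Although the analysis script is not part of this page, the modeling approach described in the abstract can be illustrated with the eirm package cited in the references (Bulut, 2021; Bulut et al., 2021). The sketch below is an assumption-laden illustration, not the author's code: the data frame timss_long, its column names, and the exact model formula are hypothetical.

```r
# Minimal sketch of an explanatory item response model with item predictors,
# assuming long-format response data with hypothetical column names.
# install.packages("eirm")  # or remotes::install_github("okanbulut/eirm")
library(eirm)

# Assumed long format: one row per student-item response
#   resp     : dichotomously scored response (0/1)
#   person   : student identifier
#   item     : item identifier
#   position : position of the item's block within the booklet (1, 2, ...)
#   format   : item format ("MC" = multiple choice, "CR" = constructed response)

# Item easiness explained by block position and item format, with random
# effects for persons and residual random effects for items
fit <- eirm(formula = "resp ~ position + format + (1|item) + (1|person)",
            data = timss_long)  # timss_long is a placeholder data frame

print(fit)  # fixed effects are reported on the easiness scale
```

Under a specification like this, a negative position coefficient would mirror the reported block position effect, and the format contrast would capture the multiple-choice versus constructed-response difference; the study's actual predictors and coding may differ.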

Ethical Statement

The data are open access; therefore, an Ethics Committee Approval Letter is not required.

References

  • Bulut, O. (2021). eirm: Explanatory item response modeling for dichotomous and polytomous item responses [Computer software]. https://github.com/okanbulut/eirm
  • Bulut, O., Gorgun, G., & Yildirim-Erbasli, S. (2021). Estimating explanatory extensions of dichotomous and polytomous Rasch models: The eirm package in R. Psych, 3(3), 308-321. https://doi.org/10.3390/psych3030023
  • Bulut, O., Guo, Q., & Gierl, M.J. (2017). A structural equation modeling approach for examining position effects. Large-scale Assessments in Education, 5(8), 2-20. https://doi.org/10.1186/s40536-017-0042-x
  • Christiansen, A., & Janssen, R. (2021). Item position effects in listening but not in reading in the European survey of language competences. Educational Assessment, Evaluation and Accountability, 33, 49-69. https://doi.org/10.1007/s11092-020-09335-7
  • Cotter, K.E., Centurino, V.A.S., & Mullis, I.V.S. (2020). Developing the TIMSS 2019 mathematics and science achievement instruments. In M.O. Martin, M. von Davier, & I.V.S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report (pp. 1.1-1.36). Boston College. https://timssandpirls.bc.edu/timss2019/methods/chapter-1.html
  • Debeer, D., Buchholz, J., & Hartig, J. (2014). Student, school, and country differences in sustained test-taking effort in the 2009 PISA reading assessment. Journal of Educational and Behavioral Statistics, 39(6), 502-523. https://doi.org/10.3102/1076998614558485
  • Debeer, D., & Janssen, R. (2013). Modeling item-position effects within an IRT framework. Journal of Educational Measurement, 50(2), 164-185. https://www.jstor.org/stable/24018105
  • De Boeck, P., Bakker, M., Zwitser, R., Nivard, M., Hofman, A., Tuerlinckx, F., & Partchev, I. (2011). The estimation of item response models with the lmer function from the lme4 package in R. Journal of Statistical Software, 39(12), 1-28. https://doi.org/10.18637/jss.v039.i12
  • Demir, E. (2010). The students' achievement in Turkey, according to the question types used in Program for International Student Assessment (PISA) cognitive domain tests [Unpublished master's thesis]. Hacettepe University.
  • Demirkol, S., & Kelecioğlu, H. (2022). Investigating the effect of item position on person and item parameters: PISA 2015 Turkey sample. Journal of Measurement and Evaluation in Education and Psychology, 13(1), 69-85. https://doi.org/10.21031/epod.958576
  • Fishbein, B., Foy, P., & Tyack, L. (2020). Reviewing the TIMSS 2019 achievement item statistics. In M.O. Martin, M. von Davier, & I.V.S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report (pp. 10.1-10.70). Boston College. https://timssandpirls.bc.edu/timss2019/methods/chapter-10.html
  • Frey, A., Hartig, J., & Rupp, A.A. (2009). An NCME instructional module on booklet designs in large-scale assessments of student achievement: Theory and practice. Educational Measurement: Issues and Practice, 28(3), 39-53. https://doi.org/10.1111/j.1745-3992.2009.00154.x
  • Gonzalez, E.J., & Rutkowski, L. (2010). Principles of multiple matrix booklet designs and parameter recovery in large-scale assessment. In M. von Davier & D. Hastedt (Eds.), IERI Monograph Series: Issues and methodologies in large-scale assessments (Vol. 3, pp. 125-156).
  • Hahne, J. (2008). Analyzing position effects within reasoning items using the LLTM for structurally incomplete data. Psychology Science Quarterly, 50(3), 379-390. https://bit.ly/3aHHyGD
  • Hartig, J., & Buchholz, J. (2012). A multilevel item response model for item position effects and individual persistence. Psychological Test and Assessment Modeling, 54(4), 418-431. https://psycnet.apa.org/record/2013-10658-006
  • Hastedt, D., & Sibberns, H. (2005). Differences between multiple choice items and constructed response items in the IEA TIMSS surveys. Studies in Educational Evaluation, 31(2-3), 145-161. https://doi.org/10.1016/j.stueduc.2005.05.007
  • He, Q., & Wheadon, C. (2013). Using the dichotomous Rasch model to analyze polytomous items. Journal of Applied Measurement, 14(1), 44-56. https://pubmed.ncbi.nlm.nih.gov/23442327/
  • Hecht, M., Weirich, S., Siegle, T., & Frey, A. (2015). Modeling booklet effects for nonequivalent group designs in large-scale assessment. Educational and Psychological Measurement, 75(4), 568-584. https://doi.org/10.1177/0013164414554219
  • Hohensinn, C., Kubinger, K.D., Reif, M., Holocher-Ertl, S., Khorramdel, L., & Frebort, M. (2008). Examining item-position effects in large-scale assessment using the linear logistic test model for structurally incomplete data. Psychology Science Quarterly, 50(3), 391-402.
  • Hohensinn, C., Kubinger, K.D., Reif, M., Schleicher, E., & Khorramdel, L. (2011). Analysing item position effects due to test booklet design within large-scale assessment. Educational Research and Evaluation, 17(6), 497-509. https://doi.org/10.1080/13803611.2011.632668
  • İlhan, M., Boztunç-Öztürk, N., & Şahin, M.G. (2020). The effect of the item’s type and cognitive level on its difficulty index: The sample of TIMSS 2015. Participatory Educational Research, 7(2), 47-59. https://doi.org/10.17275/per.20.19.7.2
  • Janssen, R., Schepers, J., & Peres, D. (2004). Models with item and item group predictors. In P. De Boeck & M. Wilson (Eds.), Explanatory item response models: A generalized linear and nonlinear approach (pp. 189-210). Springer.
  • Kelly, D.L., Centurino, V.A.S., Martin, M.O., & Mullis, I.V.S. (Eds.). (2020). TIMSS 2019 Encyclopedia: Education policy and curriculum in mathematics and science. Boston College. https://timssandpirls.bc.edu/timss2019/encyclopedia/
  • Klosner, N.C., & Gellman, E.K. (1973). The effect of item arrangement on classroom test performance: Implications for content validity. Educational and Psychological Measurement, 33(2), 413-418. https://doi.org/10.1177/001316447303300224
  • Kuang, H., & Sahin, F. (2023). Comparison of disengagement levels and the impact of disengagement on item parameters between PISA 2015 and PISA 2018 in the United States. Large-Scale Assessments in Education, 11(4), 1-31. https://doi.org/10.1186/s40536-023-00152-0
  • Le, L.T. (2007, July). Effects of item positions on their difficulty and discrimination: A study in PISA science data across test language and countries [Conference presentation]. 72nd Annual Meeting of the Psychometric Society, Tokyo, Japan. https://research.acer.edu.au/cgi/viewcontent.cgi?article=1001&context=pisa
  • Leary, L.F., & Dorans, N.J. (1985). Implications for altering the context in which test items appear: A historical perspective on an immediate concern. Review of Educational Research, 55(3), 387-413. https://www.jstor.org/stable/1170392
  • Liou, P.Y., & Bulut, O. (2020). The effects of item format and cognitive domain on students' science performance in TIMSS 2011. Research in Science Education, 50, 99-121. https://doi.org/10.1007/s11165-017-9682-7
  • Liu, J.X., Bulut, O., & Johnson, M.D. (2024). Examining position effects on students' ability and test-taking speed in the TIMSS 2019 problem-solving and inquiry tasks: A structural equation modeling approach. Psychology International, 6(2), 492-508. https://doi.org/10.3390/psycholint6020030
  • Ma, Y., & Harris, D.J. (2025). Investigating approaches to controlling item position effects in computerized adaptive tests. Educational Measurement: Issues and Practice, 44(1), 44-54. https://doi.org/10.1111/emip.12637
  • Marcq, K., Donayre, E.J.C., & Braeken, J. (2024). The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study. Studies in Educational Evaluation, 83, 1-14. https://doi.org/10.1016/j.stueduc.2024.101401
  • Martin, M.O., Mullis, I.V.S., & Foy, P. (2017). TIMSS 2019 assessment design. In I.V.S. Mullis & M.O. Martin (Eds.), TIMSS 2019 Assessment Frameworks (pp. 80-91). Boston College. http://timssandpirls.bc.edu/timss2019/frameworks/
  • Martinez, M.E. (1991). A comparison of multiple-choice and constructed figural response items. Journal of Educational Measurement, 28(2), 131-145. https://doi.org/10.1111/j.1745-3984.1991.tb00349.x
  • Masters, G.N. (1982). A Rasch model for partial credit scoring. Psychometrika, 47(2), 149-174. https://doi.org/10.1007/BF02296272
  • Meyers, J.L., Miller, G.E., & Way, W.D. (2008). Item position and item difficulty change in an IRT-based common item equating design. Applied Measurement in Education, 22(1), 38-60. https://doi.org/10.1080/08957340802558342
  • Mullis, I.V.S., & Martin, M.O. (Eds.). (2017). TIMSS 2019 Assessment Frameworks. Boston College. http://timssandpirls.bc.edu/timss2019/frameworks/
  • Mullis, I.V.S., Martin, M.O., Fishbein, B., Foy, P., & Moncaleano, S. (2021). Findings from the TIMSS 2019 Problem Solving and Inquiry Tasks. Boston College. https://timssandpirls.bc.edu/timss2019/psi/
  • Mullis, I.V.S., Martin, M.O., Foy, P., Kelly, D.L., & Fishbein, B. (2020). TIMSS 2019 International results in mathematics and science. Boston College. https://timssandpirls.bc.edu/timss2019/
  • Nagy, G., Nagengast, B., Frey, A., Becker, M., & Rose, N. (2018). A multilevel study of position effects in PISA achievement tests: student- and school-level predictors in the German tracked school system. Assessment in Education: Principles, Policy & Practice, 26(4), 422-443. https://doi.org/10.1080/0969594X.2018.1449100
  • Ong, T.Q., & Pastor, D.A. (2022). Uncovering the complexity of item position effects in a low-stakes testing context. Applied Psychological Measurement, 46(7), 571-588. https://doi.org/10.1177/01466216221108134
  • Özer Özkan, Y., & Özaslan, N. (2018). Student achievement in Turkey, according to question types used in PISA 2003-2012 mathematic literacy tests. International Journal of Evaluation and Research in Education, 7(1), 57-64. http://doi.org/10.11591/ijere.v7i1.11045
  • Pools, E. (2022). Not-reached items: An issue of time and of test-taking disengagement? The case of PISA 2015 reading data. Applied Measurement in Education, 35(3), 197-221. https://doi.org/10.1080/08957347.2022.2103136
  • Qian, J. (2014). An investigation of position effects in large-scale writing assessments. Applied Psychological Measurement, 38(7), 518-534. https://doi.org/10.1177/01466216145343
  • Robinson, P. (1993). The politics of multiple-choice versus free-response assessment. In R.E. Bennett & W.C. Ward (Eds.), Construction versus choice in cognitive measurement (pp. 313-323). Routledge.
  • Sideridis, G., Hamed, H., & Jaffari, F. (2023). The item position effects in international examinations: The roles of gender. Frontiers in Psychology, 14, 1-10. https://doi.org/10.3389/fpsyg.2023.1220384
  • Tyack, L., Fishbein, B., Bristol, J., Mao, T., & Gonzalez, G. (2024). Reviewing the TIMSS achievement data. In M. von Davier, B. Fishbein, & A. Kennedy (Eds.), TIMSS 2023 Technical Report (Methods and Procedures) (pp. 10.1-10.17). Boston College. https://doi.org/10.6017/lse.tpisc.timss.rs7695
  • Weirich, S., Hecht, M., & Böhme, K. (2014). Modeling item position effects using generalized linear mixed models. Applied Psychological Measurement, 38(7), 535-548. https://doi.org/10.1177/0146621614534955
  • Wickham, H. (2007). Reshaping data with the reshape package. Journal of Statistical Software, 21(12), 1–20. http://www.jstatsoft.org/v21/i12/
  • Wickham, H., Averick, M., Bryan, J., Chang, W., McGowan, L.D., François, R., Grolemund, G., Hayes, A., Henry, L., Hester, J., Kuhn, M., Pedersen, T.L., Miller, E., Bache, S.M., Müller, K., Ooms, J., Robinson, D., Seidel, D.P., Spinu, V., Takahashi, K., Vaughan, D., Wilke, C., Woo, K., & Yutani, H. (2019). Welcome to the tidyverse. Journal of Open Source Software, 4(43), 1-6. https://doi.org/10.21105/joss.01686
  • Wilson, M., & De Boeck, P. (2004). Descriptive and explanatory item response models. In P. De Boeck & M. Wilson (Eds.), Explanatory item response models: A generalized linear and nonlinear approach (pp. 43-74). Springer.
  • Wilson, M., De Boeck, P., & Carstensen, C.H. (2008). Explanatory item response models: A brief introduction. In J. Hartig, E. Klieme, & D. Leutner (Eds.), Assessment of competencies in educational contexts (pp. 91-120). Hogrefe & Huber Publishers.
  • Wise, L.L., Chia, W., & Park, R. (1989, March). Item position effects for tests of word knowledge and arithmetic reasoning [Conference presentation]. Annual Meeting of the American Educational Research Association, San Francisco, CA.
  • Wu, Q., Debeer, D., Buchholz, J., Hartig, J., & Janssen, R. (2019). Predictors of individual performance changes related to item positions in PISA assessments. Large-Scale Assessments in Education, 7(5). https://doi.org/10.1186/s40536-019-0073-6
  • Yoo, N. (2020). Item position and motivation effects in large-scale assessments [Unpublished doctoral dissertation]. Columbia University.
  • Zheng, X., Sahin, F., Erberber, E., & Fonseca, F. (2023). Identification and cross-country comparison of students’ test-taking behaviors in selected eTIMSS 2019 countries. Large-scale Assessments in Education, 11(32). https://doi.org/10.1186/s40536-023-00179-3
There are 55 references in total.

Details

Primary Language English
Subjects Measurement Theories and Applications in Education and Psychology
Journal Section Research Article
Authors

Neşe Öztürk Gübeş (ORCID: 0000-0003-0179-1986)

Submission Date August 14, 2024
Acceptance Date June 4, 2025
Early Pub Date July 21, 2025
Publication Date September 4, 2025
Published in Issue Year 2025 Volume: 12 Issue: 3

Cite

APA Öztürk Gübeş, N. (2025). Item block position and format effects in e-TIMSS among the low- and high-achieving countries. International Journal of Assessment Tools in Education, 12(3), 781-786. https://doi.org/10.21449/ijate.1532862
