Research Article

Disengagement matters: A response-time informed approach to scoring low-stakes assessments

Volume: 13 Number: 1 January 2, 2026

Abstract

Low-stakes assessments in K-12 education play a crucial role in monitoring student progress, yet their validity is often compromised by disengaged test-taking behaviors, such as rapid guessing and idling. This study introduces a novel application of the Nested Logit Model (NLM) to integrate test-taking engagement, as indicated by response times (RTs), with item responses to improve ability estimation. Using data from a low-stakes reading assessment in the United States (n = 27,556 students in grades 5 to 8), we compared six scoring approaches, including traditional dichotomous scoring, effort-moderated scoring, and nominal scoring that categorized responses based on RT-informed engagement. Our results demonstrated that nominal scoring approaches, particularly those distinguishing rapid guesses and idle responses, yielded superior model fit, increased measurement precision, and provided nuanced insights into examinee behaviors compared to dichotomous scoring methods. Latent class analysis further identified three distinct engagement profiles—effortful responders, rapid guessers, and idle responders—highlighting the need to address both rapid and idle behaviors in modeling. This study emphasizes the value of leveraging RT data to enhance the accuracy of low-stakes assessments while preserving response information. Findings also suggest the NLM framework as a practical and accessible tool for researchers and practitioners seeking to address disengaged behaviors and ensure the reliability of low-stakes assessments.
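The nominal scoring approach described above recodes each dichotomous item response into an engagement-aware category before model fitting. As a minimal sketch (in Python, since the study's own analyses use R), the recoding step might look like the function below. The threshold values and category labels here are illustrative assumptions, not the study's actual item-level thresholds, which would be derived from the RT distributions of each item.

```python
def classify_response(rt_seconds, is_correct, rapid_threshold, idle_threshold):
    """Recode a dichotomous item response into a nominal engagement category.

    rt_seconds      -- observed response time for the item
    is_correct      -- True if the response was scored correct
    rapid_threshold -- RTs below this are treated as rapid guesses (assumed value)
    idle_threshold  -- RTs above this are treated as idle responses (assumed value)
    """
    if rt_seconds < rapid_threshold:
        return "rapid_guess"          # too fast to reflect effortful processing
    if rt_seconds > idle_threshold:
        return "idle"                 # prolonged inactivity before responding
    # Within the engaged RT band, retain the original response information
    return "correct" if is_correct else "incorrect"


# Illustrative usage with placeholder thresholds (2 s and 60 s):
print(classify_response(1.2, True, 2.0, 60.0))    # rapid_guess
print(classify_response(75.0, False, 2.0, 60.0))  # idle
print(classify_response(14.5, True, 2.0, 60.0))   # correct
```

The resulting nominal categories can then be fit with a nested logit or nominal response model, so that rapid guesses and idle responses contribute to ability estimation differently from effortful correct and incorrect responses, rather than being discarded or scored as incorrect.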

Keywords

Ethical Statement

University of Alberta, Pro00154093.


Details

Primary Language

English

Subjects

Measurement Theories and Applications in Education and Psychology

Journal Section

Research Article

Publication Date

January 2, 2026

Submission Date

June 16, 2025

Acceptance Date

November 13, 2025

Published in Issue

Year 2026 Volume: 13 Number: 1

APA
Bulut, O., Liu, J. X., & Bulut, H. C. (2026). Disengagement matters: A response-time informed approach to scoring low-stakes assessments. International Journal of Assessment Tools in Education, 13(1), 123-144. https://doi.org/10.21449/ijate.1721066
