Research Article

The study of the effect of item parameter drift on ability estimation obtained from adaptive testing under different conditions

Volume: 9 Number: 3 September 30, 2022

Abstract

Item parameter drift (IPD) is the systematic change in the parameter values of items over time due to various causes. When IPD occurs in computerized adaptive testing (CAT), it introduces error into the estimation of item and ability parameters, so identifying the conditions under which it affects CAT is important for keeping estimation error to a minimum. This study examines the effect of IPD on measurement precision and on the test information function (TIF) in CAT administrations. The simulation study compares sample size (1,000, 5,000), IPD size (0.00, 0.50, 0.75, and 1.00 logits), percentage of items containing IPD (0%, 5%, 10%, 20%), item bank size (200, 500, 1,000), and three time points. To examine the impact of these conditions on ability estimation, measurement precision and TIF values were calculated, and a factorial analysis of variance (ANOVA) for independent samples was carried out to test whether the estimations differed across these factors. The study found that repeated measurements with an item bank containing IPD items decrease both measurement precision and the amount of information the test provides. The factorial ANOVA revealed that the differences in measurement precision and TIF were mostly statistically significant. Although all IPD conditions negatively affected measurement precision and TIF, sample size and item bank size generally had no systematic increasing or decreasing effect on these outcomes.
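The mechanism the abstract describes can be illustrated with a minimal sketch (not the authors' code): under a Rasch model, a fraction of an item bank drifts upward in difficulty by a fixed logit amount, but examinees are still scored with the stale, pre-drift parameters. The item counts, drift size (0.75 logits), and drift percentage (10%) mirror conditions from the study; the `eap_estimate` helper and all variable names are illustrative assumptions, and the sketch uses a fixed form rather than adaptive item selection.

```python
import numpy as np

rng = np.random.default_rng(0)

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def eap_estimate(responses, b, grid=np.linspace(-4, 4, 81)):
    """EAP ability estimate with a standard-normal prior (illustrative helper)."""
    p = rasch_p(grid[:, None], b[None, :])                  # (grid, items)
    like = np.prod(np.where(responses, p, 1 - p), axis=1)   # pattern likelihood
    post = like * np.exp(-grid**2 / 2)                      # times N(0, 1) prior
    post /= post.sum()
    return float(np.sum(grid * post))

# Item bank of 200 items with difficulties ~ N(0, 1), as in one study condition.
n_items = 200
b_true = rng.normal(0, 1, n_items)

# IPD: 10% of items drift by +0.75 logits, but the operational bank
# still records the original (stale) difficulties.
drift_idx = rng.choice(n_items, size=n_items // 10, replace=False)
b_drifted = b_true.copy()
b_drifted[drift_idx] += 0.75

# Responses are generated by the *drifted* items...
theta_true = 0.5
responses = rng.random(n_items) < rasch_p(theta_true, b_drifted)

# ...but scoring uses the stale parameters, which biases the estimate.
theta_stale = eap_estimate(responses, b_true)
theta_correct = eap_estimate(responses, b_drifted)
```

Comparing `theta_stale` with `theta_correct` over many replications shows the systematic estimation error that drifting items introduce; larger drift sizes or higher drift percentages widen the gap, which is the pattern the study quantifies via measurement precision and TIF.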

Details

Primary Language

English

Subjects

Other Fields of Education

Journal Section

Research Article

Publication Date

September 30, 2022

Submission Date

February 9, 2022

Acceptance Date

August 8, 2022

Published in Issue

Year 2022 Volume: 9 Number: 3

APA
Şahin Kürşad, M., Çokluk-Bökeoğlu, Ö., & Çıkrıkçı, N. (2022). The study of the effect of item parameter drift on ability estimation obtained from adaptive testing under different conditions. International Journal of Assessment Tools in Education, 9(3), 654-681. https://doi.org/10.21449/ijate.1070848
