The Impact of Item Position on Item Parameters: A Multi-Method Approach
Abstract
Multiple-choice tests are widely favored in large-scale assessments and classroom evaluations due to their practicality and efficiency. Their ease of administration and scoring makes multiple-choice tests highly practical, but they also come with disadvantages, such as the potential for cheating and the possibility of guessing the correct answer. To reduce the risk of cheating, it is common practice to administer multiple versions of a test with items arranged in different orders. However, such variations can affect student performance across different test forms. For instance, students who begin with the most difficult items may feel discouraged at the start of the test. Additionally, the psychometric properties of test items may vary across different test forms in which the same items are arranged in different orders. This study aims to examine the impact of this phenomenon on the estimation of test items' psychometric properties. The study sample consists of 8th grade students in Türkiye who took a national exam for high school admission. Data from three different subtests in four different booklets of this exam were analyzed. Item parameters were estimated using both classical test theory and item response theory and were compared across the different test forms containing the same items. Furthermore, the varying item positions were incorporated into an explanatory item response theory model and a structural equation model as an explanatory variable. The findings indicate that the psychometric properties of test items can differ significantly depending on their order within the test, highlighting the importance of considering item order effects in test design and interpretation.
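The core idea the study tests can be illustrated with a small simulation. The sketch below is purely hypothetical (the item count, sample size, and drift parameter are illustrative assumptions, not values from the study): responses follow a Rasch model in which an item's effective difficulty grows with its position in the booklet, and classical difficulty indices (proportion correct) are then compared across two booklets that present the same items in opposite orders.

```python
# Hypothetical sketch: simulating an item-position effect and comparing
# classical (CTT) item difficulty across two test forms.
# All parameter values below are illustrative assumptions.
import math
import random

random.seed(1)

N_STUDENTS = 2000
N_ITEMS = 10
POSITION_DRIFT = 0.04  # assumed: each later position adds 0.04 logits of difficulty

# Latent ("true") item difficulties, shared by both booklets.
item_b = [random.gauss(0, 1) for _ in range(N_ITEMS)]

def simulate_form(order):
    """Simulate Rasch responses for one booklet; effective difficulty
    is the latent difficulty plus a linear position effect."""
    correct = [0] * N_ITEMS
    for _ in range(N_STUDENTS):
        theta = random.gauss(0, 1)  # examinee ability
        for pos, item in enumerate(order):
            b_eff = item_b[item] + POSITION_DRIFT * pos
            p = 1.0 / (1.0 + math.exp(-(theta - b_eff)))
            if random.random() < p:
                correct[item] += 1
    return [c / N_STUDENTS for c in correct]  # CTT p-values per item

form_a = list(range(N_ITEMS))            # booklet A: items 0..9 in order
form_b = list(reversed(range(N_ITEMS)))  # booklet B: same items, reversed

p_a = simulate_form(form_a)
p_b = simulate_form(form_b)

for item in range(N_ITEMS):
    print(f"item {item}: p(A)={p_a[item]:.3f}  p(B)={p_b[item]:.3f}  "
          f"diff={p_a[item] - p_b[item]:+.3f}")
```

Under these assumptions, items placed early in booklet A but late in booklet B come out easier in A, and vice versa, mimicking the kind of form-dependent parameter estimates the study investigates. In an explanatory IRT analysis, the position term would instead be estimated as a covariate rather than fixed in advance.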
Ethical Statement
The data come from a pre-existing dataset, and permission for its use was obtained. The necessary approval was granted by the Ministry of National Education in 2016 as part of a master's thesis supervised by one of the authors.
Details
Primary Language
English
Subjects
Measurement Equivalence
Journal Section
Research Article
Publication Date
June 30, 2025
Submission Date
March 19, 2025
Acceptance Date
April 30, 2025
Published in Issue
Year 2025 Volume: 16 Number: 2
Citation (APA)
Atalay Kabasakal, K., & Doğan, N. (2025). The Impact of Item Position on Item Parameters: A Multi-Method Approach. Journal of Measurement and Evaluation in Education and Psychology, 16(2), 71-87. https://doi.org/10.21031/epod.1661135