Explanatory Item Response Models for Polytomous Item Responses
Abstract
Item response theory is a widely used framework for the design, scoring, and scaling of measurement instruments. Item response models are typically applied to dichotomously scored questions with only two score points (e.g., multiple-choice items). However, with the increasing use of instruments that include questions with multiple response categories, such as surveys, questionnaires, and psychological scales, polytomous item response models are seeing wider use in education and psychology. This study demonstrates the application of explanatory item response models to polytomous item responses in order to explain common variability in item clusters, person groups, and interactions between item clusters and person groups. Explanatory forms of several polytomous item response models, such as the Partial Credit Model and the Rating Scale Model, are demonstrated, and the estimation procedures for these models are explained. The findings suggest that explanatory item response models can be more robust and parsimonious than traditional item response models for polytomous data in which items and persons share common characteristics. By estimating fewer item parameters, explanatory polytomous item response models can also provide more information about response patterns in item responses.
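The abstract names the Partial Credit Model (PCM) without stating its form. As a point of reference, here is a minimal sketch of the standard PCM category-probability formula in Python; the function name and the example parameter values are illustrative, not taken from the article:

```python
import math

def pcm_probabilities(theta, deltas):
    """Category probabilities for one item under the Partial Credit Model.

    theta  : person ability (latent trait)
    deltas : step (threshold) parameters delta_1..delta_K for an item
             scored 0..K
    Returns a list of K+1 probabilities, one per score category.
    """
    # Cumulative sums of (theta - delta_j); category 0 has an empty sum (= 0).
    logits = [0.0]
    for d in deltas:
        logits.append(logits[-1] + (theta - d))
    # A softmax over the cumulative logits yields the category probabilities.
    max_logit = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(l - max_logit) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Example: a 3-category item (scores 0, 1, 2) with step parameters -0.5 and 0.8
probs = pcm_probabilities(theta=0.0, deltas=[-0.5, 0.8])
```

The Rating Scale Model mentioned alongside the PCM is the constrained case in which every item shares one set of threshold distances, so the same function applies with `deltas` built as item location plus shared thresholds.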
Keywords
Details
Primary Language
English
Subjects
Studies on Education
Journal Section
Research Article
Authors
Luke Stanke
0000-0002-4340-6954
United States
Okan Bulut
0000-0001-5853-1267
Canada
Publication Date
July 15, 2019
Submission Date
January 19, 2019
Acceptance Date
May 15, 2019
Published in Issue
Year: 2019, Volume: 6, Number: 2
Cited By
An Empirical Investigation of Factors Contributing to Item Nonresponse in Self-Reported Bullying Instruments
Journal of School Violence
https://doi.org/10.1080/15388220.2020.1770603
Unsupervised Machine Learning Algorithms Examine Healthcare Providers' Perceptions and Longitudinal Performance in a Digital Neonatal Resuscitation Simulator
Frontiers in Pediatrics
https://doi.org/10.3389/fped.2020.00544
Estimating Explanatory Extensions of Dichotomous and Polytomous Rasch Models: The eirm Package in R
Psych
https://doi.org/10.3390/psych3030023
IRText: An Item Response Theory-Based Approach for Text Categorization
Arabian Journal for Science and Engineering
https://doi.org/10.1007/s13369-021-06238-7
Evaluating group differences in online reading comprehension: The impact of item properties
International Journal of Testing
https://doi.org/10.1080/15305058.2022.2044821
Group differences in the emotion regulation during test-taking scale: an explanatory item response modeling approach
Current Psychology
https://doi.org/10.1007/s12144-022-03202-y
The Initial Evaluation of a Scale for Measuring Interprofessional Perspective Taking
SSRN Electronic Journal
https://doi.org/10.2139/ssrn.4197308
Modeling Item-Level Heterogeneous Treatment Effects With the Explanatory Item Response Model: Leveraging Large-Scale Online Assessments to Pinpoint the Impact of Educational Interventions
Journal of Educational and Behavioral Statistics
https://doi.org/10.3102/10769986231171710
Predicting reading comprehension performance based on student characteristics and item properties
Studies in Educational Evaluation
https://doi.org/10.1016/j.stueduc.2023.101309
Modelling the Differences in Social and Emotional Skills with Polytomous Explanatory IRT: The Example of Assertiveness Skill
Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi
https://doi.org/10.21031/epod.1299468
Disentangling Person-Dependent and Item-Dependent Causal Effects: Applications of Item Response Theory to the Estimation of Treatment Effect Heterogeneity
Journal of Educational and Behavioral Statistics
https://doi.org/10.3102/10769986241240085
A longitudinal examination of the psychometric properties of the English perceived stress scale-four (PSS-4) in mental health counsellors using item response theory
Stress and Health
https://doi.org/10.1002/smi.3468
Does higher endorsement with collaboration lead to better performance on collaborative problem solving? An explanatory item response approach to cross-cultural comparisons
Frontiers in Psychology
https://doi.org/10.3389/fpsyg.2024.1468533
Polytomous explanatory item response models for item discrimination: Assessing negative-framing effects in social-emotional learning surveys
Behavior Research Methods
https://doi.org/10.3758/s13428-025-02625-2
Does Vibrato Define Genre or Vice Versa? A Novel Parametric Approach to Vocal Vibrato Analysis
Journal of Voice
https://doi.org/10.1016/j.jvoice.2025.02.046
The Impact of Item Position on Item Parameters: A Multi-Method Approach
Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi
https://doi.org/10.21031/epod.1661135
Determining Middle School Students' Statistical Literacy Levels through the Example Generation Approach [Örnek Üretme Yaklaşımı ile Ortaokul Öğrencilerinin İstatistiksel Okuryazarlık Düzeylerinin Belirlenmesi]
Ege Eğitim Dergisi
https://doi.org/10.12984/egeefd.1802128
Structural and behavioral implications of wording effects: An integrated investigation using exploratory graph analysis and explanatory item response modeling
ODÜ Sosyal Bilimler Araştırmaları Dergisi (ODÜSOBİAD)
https://doi.org/10.48146/odusobiad.1810609
Validity and fairness of the PISA 2018 Global Competence assessment: an argument-based evaluation via explanatory item response models
Humanities and Social Sciences Communications
https://doi.org/10.1057/s41599-026-06979-6
Examining the Psychometric Properties of the Perceived Stress Scale in Black Women During the Perinatal Period Using Explanatory Rasch Models
Measurement and Evaluation in Counseling and Development
https://doi.org/10.1080/07481756.2025.2585794