The DIF Identification in Constructed Response Items Using the Partial Credit Model
Abstract
This study identified the load, the type, and the significance of differential item functioning (DIF) in constructed-response items using the partial credit model (PCM). The data were students' responses to PISA-like test items completed by 386 ninth-grade students and 460 tenth-grade students, all about 15 years old, in the Yogyakarta Special Region province of Indonesia. Item characteristics were estimated under the PCM, with students categorized by grade, using the CONQUEST software. From these item characteristics, category response function (CRF) graphs were drawn to identify whether the DIF in an item was uniform or non-uniform. The significance of DIF was determined by comparing the difference between the groups' difficulty parameters with its estimation error in the CONQUEST output. Of the 18 items analyzed, 4 showed no DIF, 5 contained DIF that was not statistically significant, and 9 contained statistically significant DIF. The causes of the items containing DIF are discussed.
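The procedure summarized above, estimating PCM item parameters separately for each grade group, drawing category response functions, and flagging DIF when the difficulty estimates differ by more than their estimation error allows, can be sketched as follows. This is a minimal illustration, not the study's actual code: the function names, the z = 1.96 criterion, and all parameter values are assumptions for demonstration.

```python
import math

def pcm_category_probs(theta, deltas):
    """Partial credit model: probabilities of each score category
    (0..m) for ability theta and step parameters deltas.
    Plotting these over a theta grid, per group, gives a CRF graph."""
    # Numerator for category k is exp of the cumulative sum of
    # (theta - delta_j) for j = 1..k; category 0 has sum 0.
    sums = [0.0]
    for d in deltas:
        sums.append(sums[-1] + (theta - d))
    exps = [math.exp(s) for s in sums]
    total = sum(exps)
    return [e / total for e in exps]

def dif_is_significant(delta_g1, delta_g2, se1, se2, z=1.96):
    """Flag DIF as significant when the gap between the two groups'
    difficulty estimates exceeds z standard errors of the difference."""
    diff = delta_g1 - delta_g2
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)
    return abs(diff) > z * se_diff

# Hypothetical values: a 0.6-logit gap with small errors is flagged,
# a 0.1-logit gap with larger errors is not.
print(dif_is_significant(0.8, 0.2, 0.1, 0.1))   # significant
print(dif_is_significant(0.3, 0.2, 0.2, 0.2))   # not significant
```

Whether the DIF is uniform or non-uniform is then judged from the CRF graphs: if one group's category curves sit consistently above the other's across the whole ability range, the DIF is uniform; if the curves cross, it is non-uniform.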
References
- Acar, T. (2011). Sample size in differential item functioning: An application of hierarchical linear modeling. Kuram ve Uygulamada Eğitim Bilimleri (Educational Sciences: Theory & Practice), 11(1), 284-288.
- Adams, R.J. (1992). Item bias. In J.P. Keeves (Ed.), The IEA technical handbook (pp. 177-187). The Hague: The International Association for the Evaluation of Educational Achievement (IEA).
- Adams, R., & Wu, M. (2010). Differential item functioning. Retrieved from https://www.acer.org/files/Conquest-Tutorial-6-DifferentialItemFunctioning.pdf
- Akour, M., Sabah, S., & Hammouri, H. (2015). Net and global differential item functioning in PISA polytomously scored science items: An application of the differential step functioning framework. Journal of Psychoeducational Assessment, 33(2), 166-176.
- Budiono, B. (2004). Perbandingan metode Mantel-Haenszel, sibtest, regresi logistik, dan perbedaan peluang dalam mendeteksi keberbedaan fungsi butir [Comparison of the Mantel-Haenszel, SIBTEST, logistic regression, and probability-difference methods in detecting differential item functioning]. Doctoral dissertation, Universitas Negeri Yogyakarta, Indonesia.
- Bulut, O., & Suh, Y. (2017). Detecting multidimensional differential item functioning with the multiple indicators multiple causes model, the item response theory likelihood ratio test, and logistic regression. Frontiers in Education, October 2017, 1-14.
- Camilli, G., & Shepard, L.A. (1994). Methods for identifying biased test items. Thousand Oaks, CA: Sage Publications.
- Da Costa, P.D., & Araujo, L. (2012). Differential item functioning (DIF): What functions differently for immigrant students in PISA 2009 reading items? JRC Scientific and Policy Reports. Luxembourg: European Commission.
Details
Primary Language
English
Subjects
Studies on Education
Journal Section
Research Article
Authors
Heri Retnawati
Universitas Negeri Yogyakarta
Indonesia
Publication Date
January 1, 2018
Submission Date
August 8, 2017
Acceptance Date
October 26, 2017
Published in Issue
Year 2018 Volume: 5 Number: 1
Cited By
Developing Classroom Assessment Tool using Learning Management System-based Computerized Adaptive Test in Vocational High Schools
Journal of Education Research and Evaluation
https://doi.org/10.23887/jere.v6i1.35630