TY - JOUR
T1 - Standart MIMIC ve Arınık Çapa Maddeli MIMIC Yöntemleriyle Gerçekleştirilen DMF Çalışması
TT - A DIF Study by Using Standard MIMIC and MIMIC With a Pure Anchor
AU - Tamcı, Pelin
AU - Kelecioğlu, Hülya
PY - 2024
DA - August
Y2 - 2024
DO - 10.17152/gefad.1471797
JF - Gazi Eğitim Fakültesi Dergisi
JO - GUJGEF
PB - Gazi University
WT - DergiPark
SN - 1301-9058
SP - 1001
EP - 1022
VL - 44
IS - 2
LA - tr
AB - Bu çalışmanın amacı değişen madde fonksiyonu (DMF) belirleme yöntemlerinden standart MIMIC (M-ST) ve arınık çapa maddeli MIMIC (M-AÇM) yöntemlerinin benzerlik ve farklılıklarını belirlemektir. Araştırma PISA 2012 verileri ile 900 ve 450 kişilik örneklemler üzerinde referans-odak grup oranları 550/350, 450/450, 300/150, 225/225 olacak şekilde yürütülmüştür. M-ST ve M-AÇM yöntemleri ile belirlenen DMF’li maddeler karşılaştırıldığında, tüm koşullarda M-AÇM yöntemiyle daha az sayıda DMF’li madde belirlendiği görülmüştür. 26 nolu maddenin tüm koşullarda her iki DMF belirleme yöntemiyle de DMF gösterdiği belirlenmiştir. En çok DMF’li madde sayısının M-ST yönteminde 900 kişilik örneklem büyüklüğü için referans-odak oranı 550/350 olduğunda ortaya çıktığı görülmüştür.
KW - Standart MIMIC
KW - arınık çapa maddeli MIMIC
KW - DMF
N2 - The aim of this study is to determine the similarities and differences between the standard MIMIC (M-ST) and MIMIC with a pure anchor (M-PA) methods, both of which are differential item functioning (DIF) detection methods. The research was conducted on PISA 2012 data with samples of 900 and 450 people, with reference-focal group ratios of 550/350, 450/450, 300/150, and 225/225. When the items flagged for DIF by the M-ST and M-PA methods were compared, fewer DIF items were flagged by the M-PA method under all conditions. Item 26 showed DIF under all conditions with both DIF detection methods.
The highest number of DIF items occurred with the M-ST method for the 900-person sample when the reference-focal ratio was 550/350.
CR - Camilli, G., & Shepard, L. (1994). Methods for identifying biased test items. London: Sage.
CR - Clauser, B., & Mazor, K. (1998). Using statistical procedures to identify differentially functioning test items. Educational Measurement: Issues and Practice, 17, 31-44. https://doi.org/10.1111/j.1745-3992.1998.tb00619.x
CR - Dorans, N. J., & Holland, P. W. (1993). DIF detection and description: Mantel-Haenszel and standardization. In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 35-66). Hillsdale, NJ: Lawrence Erlbaum Associates.
CR - Finch, H. (2005). The MIMIC model as a method for detecting DIF: Comparison with Mantel-Haenszel, SIBTEST, and the IRT likelihood ratio. Applied Psychological Measurement, 29, 278-295. https://doi.org/10.1177/0146621605275728
CR - Field, A. (2009). Discovering statistics using SPSS. London: SAGE.
CR - Gallo, J. J., Anthony, J. C., & Muthén, B. O. (1994). Age differences in the symptoms of depression: A latent trait analysis. Journal of Gerontology: Psychological Sciences, 49, P251-P264. https://doi.org/10.1093/geronj/49.6.p251
CR - Hallquist, M. N., & Wiley, J. F. (2018). MplusAutomation: An R package for facilitating large-scale latent variable analyses in Mplus. Structural Equation Modeling, 25, 621-638. https://doi.org/10.1080/10705511.2017.1402334
CR - Holland, P. W., & Thayer, D. T. (1988). Differential item performance and the Mantel-Haenszel procedure. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 129-145). Hillsdale, NJ: Erlbaum.
CR - Lord, F. M. (1980). Applications of item response theory to practical testing problems. Hillsdale, NJ: Erlbaum.
CR - Muthén, L. K., & Muthén, B. O. (1998-2010). Mplus user’s guide (6th ed.). Los Angeles, CA: Muthén & Muthén.
CR - OECD (2023). “What is PISA?”, in PISA 2022 assessment and analytical framework. Paris: OECD Publishing.
https://doi.org/10.1787/82f078d7-en
CR - OECD (2010). Translation and adaptation guidelines for PISA 2012. Budapest: OECD Publishing.
CR - Oort, F. J. (1998). Simulation study of item bias detection with restricted factor analysis. Structural Equation Modeling, 5, 107-124.
CR - Shealy, R., & Stout, W. (1993). A model-based standardization approach that separates true bias/DIF from group ability differences and detects test bias/DTF as well as item bias/DIF. Psychometrika, 58(2), 159-194. https://doi.org/10.1007/BF02294572
CR - Shih, C.-L., & Wang, W.-C. (2009). Differential item functioning detection using the multiple indicators, multiple causes (MIMIC) method with a pure short anchor. Applied Psychological Measurement, 33, 184-199. https://doi.org/10.1177/0146621608321758
CR - Swaminathan, H., & Rogers, H. J. (1990). Detecting differential item functioning using logistic regression procedures. Journal of Educational Measurement, 27(4), 361-370.
CR - Tamcı, P. (2018). Kayıp veriyle başa çıkma yöntemlerinin değişen madde fonksiyonu üzerindeki etkisinin incelenmesi [Examination of the effect of methods for dealing with missing data on differential item functioning] (Master's thesis). Hacettepe Üniversitesi, Eğitim Bilimleri Enstitüsü, Ankara.
CR - Wainer, H., Sireci, S., & Thissen, D. (1991). Differential testlet functioning: Definitions and detection. Journal of Educational Measurement, 28, 197-219.
CR - Wang, W.-C., & Yeh, Y.-L. (2003). Effects of anchor item methods on differential item functioning detection with the likelihood ratio test. Applied Psychological Measurement, 27, 479-498. https://doi.org/10.1177/0146621603259902
CR - Wang, W.-C. (2004). Effects of anchor item methods on the detection of differential item functioning within the family of Rasch models. Journal of Experimental Education, 72, 221-261. https://doi.org/10.3200/JEXE.72.3.221-261
CR - Wang, W.-C., Shih, C.-L., & Yang, C.-C. (2009). The MIMIC method with scale purification for detecting differential item functioning. Educational and Psychological Measurement, 69(5), 713-731.
https://doi.org/10.1177/0013164409332228
CR - Wang, W.-C., & Shih, C.-L. (2010). MIMIC methods for assessing differential item functioning in polytomous items. Applied Psychological Measurement, 34(3), 166-180. https://doi.org/10.1177/0146621609355279
CR - Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.
UR - https://doi.org/10.17152/gefad.1471797
L1 - https://dergipark.org.tr/en/download/article-file/3877493
ER -