Research Article

Determining the Psychometric Properties of a Test with the DINA Model and Providing Individual Feedback: TIMSS 2015

Year 2019, Volume: 17, Issue: 1, 103-120, 26.06.2019

Abstract

The aim of this study is to show how the DINA model can be used to determine the psychometric properties of a test and to provide students with individual feedback. To this end, the study used responses to the multiple-choice items in Booklet 1 of the eighth-grade mathematics test from the TIMSS 2015 Turkey sample; 435 students took Booklet 1. The DINA model parameters were estimated using R Studio and interpreted in comparison with the Classical Test Theory (CTT) and Item Response Theory (IRT) parameters. In addition, a sample diagnostic profile report prepared for individualized feedback is presented. The results indicate that the DINA model fit the data well (SRMSR, MADcor, MADQ3, MADaQ3, and RMSEA < 0.05), and the item parameters examined under the DINA model were found to be similar to the CTT and IRT parameters. The reliability values obtained with the DINA model, CTT, and IRT were Pc = 0.913, KR-20 = 0.80, and marginal reliability = 0.70, respectively, so the DINA-based reliability is higher than the CTT and IRT values. Based on these results, the DINA model is recommended for determining the psychometric properties of a test, and, unlike CTT and IRT, the cognitive diagnosis modeling (CDM) framework is recommended for providing detailed individualized feedback.
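For context, the DINA (deterministic inputs, noisy "and" gate) model referred to above specifies the probability of a correct response as a function of attribute mastery (Junker & Sijtsma, 2001; De la Torre, 2009):

P(X_{ij} = 1 \mid \boldsymbol{\alpha}_i) = (1 - s_j)^{\eta_{ij}} \, g_j^{1 - \eta_{ij}}, \qquad \eta_{ij} = \prod_{k=1}^{K} \alpha_{ik}^{q_{jk}}

where \alpha_{ik} indicates whether examinee i has mastered attribute k, q_{jk} is the Q-matrix entry linking item j to attribute k, and s_j and g_j are the slip and guessing parameters of item j.

The article states that the DINA parameters were estimated with R Studio but does not reproduce the script. The lines below are only a minimal sketch of such an estimation with the CDM package (George & Robitzsch, 2015; Ravand & Robitzsch, 2015), not the authors' actual code; the response matrix resp and the Q-matrix Q are hypothetical placeholders assumed to have been prepared beforehand.

  library(CDM)
  # Fit the DINA model: 'resp' is the 0/1 item-response matrix for Booklet 1
  # and 'Q' is the item-by-attribute Q-matrix (both placeholders here).
  mod <- din(data = resp, q.matrix = Q, rule = "DINA")
  summary(mod)                  # item-level slip and guessing parameters
  fit <- modelfit.cor.din(mod)  # absolute fit statistics (e.g., MADcor, SRMSR)
  summary(fit)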

References

  • Awopeju, O. A., & Afolabi, E. R. I. (2016). Comparative analysis of classical test theory and item response theory based item parameter estimates of senior school certificate mathematics examination. European Scientific Journal, ESJ, 12(28).
  • De Ayala, R. J. (2009). Theory and practice of item response theory. Guilford Publications.
  • De la Torre, J., & Douglas, J. (2004). Higher-order latent trait models for cognitive diagnosis. Psychometrika, 69 (3), 333-353.
  • De la Torre, J. (2007). Evaluation of model fit in a large-scale assessment application of cognitive diagnosis. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.
  • De la Torre, J. (2009). DINA model and parameter estimation: A didactic. Journal of Educational and Behavioral Statistics, 34, 115-130.
  • De la Torre, J. (2011). The generalized DINA model framework. Psychometrika, 76(2), 179-199.
  • De la Torre, J., & Minchen, N. (2014). Cognitively diagnostic assessments and the cognitive diagnosis model framework. Psicología Educativa, 20(2), 89-97.
  • DiBello, L. V., Roussos, L. A., & Stout, W. F. (2007). Review of cognitively diagnostic assessment and a summary of psychometric models. In C. R. Rao & S. Sinharay (Eds.), Handbook of statistics. Volume 26: Psychometrics (pp. 979-1030). Amsterdam, The Netherlands: Elsevier.
  • Ebel, R. L., & Frisbie, D. A. (2009). Essentials of educational measurement (5th ed.). New Delhi: Prentice-Hall of India Pvt. Limited.
  • Embretson, S. (1984). A general latent trait model for response processes. Psychometrika, 49(2), 175-186. doi: 10.1007/BF02294171
  • Embretson, S. E. (1997). Multicomponent response models. In W. J. van der Linden & R. L. Hambleton (Eds.), Handbook of modern item response theory (pp. 305-321). New York: Springer.
  • George, A. C., & Robitzsch, A. (2015). Cognitive diagnosis models in R: A didactic. The Quantitative Methods for Psychology, 11(3), 189-205.
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. London: Sage.
  • Hartz, S., Roussos, L., & Stout, W. (2002). Skills diagnosis: Theory and practice. User Manual for Arpeggio software. ETS.
  • Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55.
  • Jang, E. E. (2005). A validity narrative: Effects of reading skills diagnosis on teaching and learning in the context of NG TOEFL (Doctoral dissertation, University of Illinois at Urbana-Champaign).
  • Junker, B.W., & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25, 258–272.
  • Kehoe, J. (1995). Basic Item Analysis for Multiple-Choice Tests. ERIC/AE Digest.
  • Lei, P. W., & Li, H. (2016). Performance of fit indices in choosing correct cognitive diagnostic models and Q-matrices. Applied Psychological Measurement, 1-13. doi:10.1177/0146621616647954
  • Li, F. (2008). A modified higher-order DINA model for detecting differential item functioning and differential attribute functioning (Doctoral dissertation, University of Georgia).
  • Li, X., & Wang, W. C. (2015). Assessment of differential item functioning under cognitive diagnosis models: The DINA model example. Journal of Educational Measurement, 52(1), 28-54.
  • Maydeu-Olivares, A., Cai, L., & Hernández, A. (2011). Comparing the Fit of Item Response Theory and Factor Analysis Models. Structural Equation Modeling: A Multidisciplinary Journal, 18(3), 333–356. doi:10.1080/10705511.2011.581993
  • Maydeu-Olivares, A. (2013). Goodness-of-fit assessment of item response theory models. Measurement: Interdisciplinary Research and Perspectives, 11, 71-137. doi:10.1080/15366367.2013.831680
  • Maydeu-Olivares, A., & Joe, H. (2014). Assessing Approximate Fit in Categorical Data Analysis. Multivariate Behavioral Research, 49(4), 305–328. doi:10.1080/00273171.2014.911075
  • MEB (2017). Eğitimde Öğrenci Gelişimini İzleme Değerlendirme Sistemi. Retrieved November 1, 2017, from http://odsgm.meb.gov.tr/www/egitimde-ogrenci-gelisimini-izleme-degerlendirme-sistemi/icerik/257
  • Ravand, H. (2015). Application of a cognitive diagnostic model to a high-stakes reading comprehension test. Journal of Psychoeducational Assessment, 1-18.
  • Ravand, H., & Robitzsch, A. (2015). Cognitive Diagnostic Modeling Using R. Practical Assessment, Research & Evaluation, 20(11), 1-12.
  • Rupp, A. A., & Templin, J. L. (2008). Unique characteristics of diagnostic classification models: A comprehensive review of the current state of the art. Measurement, 6(4), 219-262.
  • Tatsuoka, K. K. (1985). A probabilistic model for diagnosing misconceptions by the pattern classification approach. Journal of Educational and Behavioral Statistics, 10, 55-73.
  • Templin, J., & Bradshaw, L. (2013). Measuring the reliability of diagnostic classification model examinee estimates. Journal of Classification, 30(2), 251-275.
  • Yen, W. M. (1984). Effects of local item dependence on the fit and equating performance of the three-parameter logistic model. Applied Psychological Measurement, 8, 125-145. doi:10.1177/014662168400800201
  • Zhang, W. (2006). Detecting differential item functioning using the DINA model. Unpublished doctoral dissertation, University of North Carolina at Greensboro.

Details

Primary Language: Turkish
Subjects: Studies on Education
Section: Articles
Authors

Mahmut Sami Koyuncu 0000-0002-6651-4851

Ayşenur Erdemir 0000-0001-9656-0878

Sinem Şenferah 0000-0001-7932-7644

Publication Date: 26 June 2019
Submission Date: 18 February 2019
Acceptance Date: 18 April 2019
Published in Issue: Year 2019, Volume: 17, Issue: 1

Cite

APA Koyuncu, M. S., Erdemir, A., & Şenferah, S. (2019). DINA Modele Göre Testin Psikometrik Özelliklerinin Belirlenmesi ve Bireysel Dönüt Verilmesi: TIMSS 2015. Türk Eğitim Bilimleri Dergisi, 17(1), 103-120.

Türk Eğitim Bilimleri Dergisi is published by the Rectorate of Gazi University.

Creative Commons License