
Year 2014, Volume: 5 Issue: 1, 72 - 87, 11.03.2014
https://doi.org/10.21031/epod.64124

Abstract

In this study, the attitude items in the PISA 2009 student questionnaire were analysed comparatively to determine whether they function differently across cultures, using samples from Turkey-Finland, Turkey-Russia, Singapore-New Zealand and Australia-New Zealand. More specifically, eleven attitude items concerning reading were analysed. The countries were chosen on the basis of achievement rank and the characteristics of their language and culture. Exploratory factor analysis was used to examine the factor pattern of the questionnaire. Differential item functioning (DIF) analyses were carried out with poly-SIBTEST, ordinal logistic regression and the item response theory likelihood-ratio method. The results showed that 91% of the items displayed DIF between countries with different languages and cultures; this rate was 82% between countries sharing the same language but different cultures, and 37% between countries with the same language and culture. Therefore, culture rather than language seems to cause DIF in the items.
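The logistic-regression DIF approach mentioned in the abstract rests on a nested-model comparison: match examinees on a proxy for the trait (e.g. total score), then test whether group membership still explains item responses. The sketch below is not the authors' analysis; it is a minimal illustration, simplified to a dichotomous item with simulated data (all variable names and the simulated effect size are hypothetical), using a likelihood-ratio test for uniform DIF in the spirit of Zumbo's logistic regression framework.

```python
import numpy as np

def fit_logistic(X, y, n_iter=50):
    """Fit a logistic regression by Newton-Raphson; return coefficients and log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        # Newton step: beta += (X' W X)^-1 X' (y - p); tiny ridge for stability
        H = X.T @ (X * W[:, None]) + 1e-8 * np.eye(X.shape[1])
        beta += np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return beta, ll

def lr_dif_test(item, total, group):
    """Likelihood-ratio DIF test: does group membership add explanatory
    power beyond the matching variable (total score)?"""
    _, ll_base = fit_logistic(total[:, None], item)                   # model 1: score only
    _, ll_full = fit_logistic(np.column_stack([total, group]), item)  # model 2: score + group
    g2 = 2 * (ll_full - ll_base)   # ~ chi-square, 1 df, under H0 of no uniform DIF
    return g2, g2 > 3.841          # 3.841 = chi-square(1) critical value at alpha = .05

# Simulated example: the item is harder for the focal group at equal trait level
rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)            # 0 = reference, 1 = focal
theta = rng.normal(0, 1, n)              # latent trait
total = theta + rng.normal(0, 0.3, n)    # noisy matching variable (stand-in for total score)
logit = theta - 0.8 * group              # built-in uniform DIF against the focal group
item = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

g2, flagged = lr_dif_test(item, total, group)
print(f"G^2 = {g2:.2f}, DIF flagged: {flagged}")
```

In the study itself the items are polytomous, so the corresponding ordinal (proportional-odds) model is used instead of the binary one; the nested-model logic is identical.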

References

  • Allalouf, A., Hambleton, R., & Sireci, S. (1999). Identifying the causes of DIF in translated verbal items. Journal of Educational Measurement, 36, 185–198.
  • Anıl, D. (2009). Uluslararası öğrenci başarılarını değerlendirme programı (PISA)'nda Türkiye'deki öğrencilerin fen bilimleri başarılarını etkileyen faktörler. Eğitim ve Bilim, 34, 87–100.
  • Ankenmann, R. D., Witt, E. A., & Dunbar, S. B. (1999). An investigation of the power of the likelihood ratio goodness-of-fit statistic in detecting differential item functioning. Journal of Educational Measurement, 36, 277–300.
  • Asil, M. (2010). Uluslararası öğrenci değerlendirme programı (PISA) 2006 öğrenci anketinin kültürler arası eşdeğerliğinin incelenmesi. Doktora tezi, Hacettepe Üniversitesi, Ankara.
  • Asil, M., ve Gelbal, S. (2012). PISA öğrenci anketinin kültürler arası eşdeğerliği. Eğitim ve Bilim, 37(166), 236-2
  • Atalay, K. ve Kelecioğlu, H. (2012). PISA 2006 öğrenci anketinde yer alan maddelerin değişen madde fonksiyonu açısından incelenmesi. Ankara Üniversitesi Eğitim Bilimleri Fakültesi Dergisi, 45(2), 77–96.
  • Basokçu, O., & Öğretmen, T. (2014). Comparison of parametric and nonparametric item response techniques in determining differential item functioning in polytomous scale. American Journal of Theoretical and Applied Statistics, 3(2), 31–38.
  • Berberoğlu, G. ve Kalender, İ. (2005). Öğrenci başarısının yıllara, okul türlerine, bölgelere göre incelenmesi: ÖSS ve PISA analizi. Eğitim Bilimleri ve Uygulama, 4(7), 21–35.
  • Camilli, G., & Shepard, L. A. (1994). Methods for identifying biased test items. London: Sage Publications.
  • Chang, H., Mazzeo, J., & Roussos, L. A. (1996). Detecting DIF for polytomously scored items: An adaptation of the SIBTEST procedure. Journal of Educational Measurement, 33, 333–353.
  • Clauser, B. E., & Mazor, K. M. (1998). Using statistical procedures to identify differentially functioning test items. Educational Measurement: Issues and Practice, 17, 31–44.
  • Cohen, A. S., Kim, S.-H., & Baker, F. B. (1993). Detection of differential item functioning in the graded response model. Applied Psychological Measurement, 17(4), 335–350.
  • Dodeen, H., & Johanson, G. A. (2003). An analysis of sex-related differential item functioning in attitude assessment. Assessment & Evaluation in Higher Education, 28(2), 129–134.
  • EARGED. (2004). Öğrenci başarısını belirleme programı (PISA-2003), ulusal ön rapor. Ankara: MEB Eğitimi Araştırma ve Geliştirme Dairesi Başkanlığı.
  • Elosua, P., & López-Jáuregui, A. (2007). Potential sources of differential item functioning. International Journal of Testing, 7(1), 39–52.
  • Ercikan, K. (2002). Disentangling the sources of differential item functioning in multilingual assessments. International Journal of Testing, 2(3&4), 199–215.
  • Gierl, M. J., Jodoin, M., & Ackerman, T. (2000). Performance of Mantel-Haenszel, simultaneous item bias test and logistic regression when the proportion of DIF items is large. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, Louisiana, USA.
  • Hambleton, R. K., & Patsula, L. (1999). Increasing the validity of adapted tests: Myths to be avoided and guidelines for improving test adaptation practices. Journal of Applied Testing Technology, 1(1), 1–30.
  • Holland, P. W., & Wainer, H. (1993). Differential item functioning. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hulin, C. L., Drasgow, F., & Parsons, C. K. (1983). Item response theory: Application to psychological measurement. Homewood, IL: Dow Jones-Irwin.
  • Kim, S.-H., & Cohen, A. S. (1998). Detection of differential item functioning under the graded response model with the likelihood ratio test. Applied Psychological Measurement, 22(4), 345–355.
  • Le, L. T. (2009). Investigating gender differential item functioning across countries and test languages for PISA science items. International Journal of Testing, 9(2), 122–133.
  • Miller, T. R., & Spray, J. A. (1993). Logistic discriminant function analysis for DIF identification of polytomously scored items. Journal of Educational Measurement, 30(2), 107–122.
  • Milli Eğitim Bakanlığı [MEB] (2010). PISA 2009 Uluslararası öğrenci değerlendirme programı ulusal ön rapor. Ankara: MEB Yayınları.
  • Öğretmen, T. & Başokçu, T. (2013). Öğretmen öz yeterlilik ölçeğinde değişen madde fonksiyonlarının ağırlıklandırılmış cevap modeli ile belirlenmesi. Ege Eğitim Dergisi, 14(2), 63–78.
  • PISA web site. Retrieved June 8, 2012, from www.pisa.oecd.org
  • Rindermann, H. (2007). The g-factor of international cognitive ability comparisons: The homogeneity of results in PISA, TIMSS, PIRLS and IQ-tests across nations. European Journal of Personality, 21(5), 667–706.
  • Roussos, L. L., & Stout, W. F. (1996). Simulation studies of the effects of small sample size and studied item parameters on SIBTEST and Mantel-Haenszel type I error performance. Journal of Educational Measurement, 33(2), 215–230.
  • Schulz, W. (2005, April 7–11). Testing parameter invariance for questionnaire indices using confirmatory factor analysis and item response theory. Paper presented at the Annual Meeting of the American Educational Research Association (AERA), San Francisco, USA.
  • Somer, O., Korkmaz, M., Dural, S., & Can, S. (2009). Detection of measurement equivalence by structural equation modeling and item response theory. Turkish Journal of Psychology, 24(64).
  • Taşdemir, M., ve Taşdemir, A. (2008). A comparison of Turkish primary school students' achievement in science and maths subjects. Journal of Qafqaz University, 22(2), 190–198.
  • Thissen, D. (2001). IRTLRDIF v.2.0b: Software for the computation of the statistics involved in item response theory likelihood-ratio tests for differential item functioning. University of North Carolina at Chapel Hill: L. L. Thurstone Psychometric Laboratory.
  • Van de Vijver, F. J. R., & Tanzer, N. K. (2004). Bias and equivalence in cross-cultural assessment. European Review of Applied Psychology, 54, 119–135.
  • Yıldırım, H. H. & Berberoğlu, G. (2006). Judgmental and statistical analyses of the PISA 2003 mathematics literacy items. International Journal of Testing, 9(2), 108–121.
  • Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.
  • Zumbo, B. D., & Gelin, M. N. (2003). Differential item functioning results may change depending on how an item is scored: An illustration with the Center for Epidemiologic Studies Depression Scale. Educational and Psychological Measurement, 63, 65–74.
  • Zumbo, B. D., & Thomas, D. R. (1996). A measure of effect size using logistic regression procedures. Paper presented at the National Board of Medical Examiners, Philadelphia, PA.

PISA2009 Öğrenci Anketi Tutum Maddelerinin Kültüre Göre Değişen Madde Fonksiyonu Açısından İncelenmesi


Details

Primary Language Turkish
Journal Section Articles
Authors

Bilge Gök

Kübra Atalay Kabasakal

Hülya Kelecioğlu

Publication Date March 11, 2014
Published in Issue Year 2014 Volume: 5 Issue: 1

Cite

APA Gök, B., Atalay Kabasakal, K., & Kelecioğlu, H. (2014). PISA2009 Öğrenci Anketi Tutum Maddelerinin Kültüre Göre Değişen Madde Fonksiyonu Açısından İncelenmesi. Journal of Measurement and Evaluation in Education and Psychology, 5(1), 72-87. https://doi.org/10.21031/epod.64124