Research Article

Investigation of Gender Bias in TIMSS 2015 and 2019 Mathematics Items in Turkey: Differential Item Functioning Analysis with the SIBTEST Procedure

Year 2022, Volume: 23, Issue: Special Issue (Özel Sayı), 211-258, 26.03.2022

Abstract

The aim of this study is to determine whether the 483 eighth-grade mathematics items administered in the TIMSS international assessments (224 in TIMSS 2015 and 259 in TIMSS 2019) show item bias between male and female students in Turkey and, if so, to examine the biased items in detail within the TIMSS conceptual framework. The sample consists of 5,080 students in total: 3,058 eighth graders (n_girls = 1,577, n_boys = 1,481) in TIMSS 2015 and 2,022 eighth graders (n_girls = 1,027, n_boys = 995) in TIMSS 2019. As a prerequisite for the analyses, Confirmatory Factor Analysis was performed on the mathematics booklets used in the two assessments; items showing bias were then identified with the SIBTEST procedure, a method for detecting Differential Item Functioning (DIF). The analyses indicated that 25 of the 224 mathematics items in TIMSS 2015 and 15 of the 259 mathematics items in TIMSS 2019 showed gender-related item bias. The distribution of these flagged items by gender, and ultimately by the content and cognitive domains of the TIMSS conceptual framework, is presented. The findings are discussed in relation to the literature, and recommendations are offered accordingly.
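
For readers unfamiliar with the procedure, the core of SIBTEST (Shealy & Stout, 1993) can be summarized as follows; this is a simplified sketch of the general method, not a reproduction of the computations reported in the article. Examinees in the reference group (R) and the focal group (F) are matched on a valid (matching) subtest score k, and the studied item's weighted, regression-corrected group difference is estimated as

\hat{\beta}_{UNI} = \sum_{k} \hat{p}_k \left( \bar{Y}^{*}_{Rk} - \bar{Y}^{*}_{Fk} \right),

where \hat{p}_k is the proportion of focal-group examinees at matching score k, and \bar{Y}^{*}_{Rk} and \bar{Y}^{*}_{Fk} are the regression-corrected mean item scores of the two groups at that score level. The associated test statistic

B = \hat{\beta}_{UNI} / \hat{\sigma}(\hat{\beta}_{UNI})

is approximately standard normal when the item contains no DIF, so items whose |B| exceeds the chosen critical value are flagged, and the sign of \hat{\beta}_{UNI} indicates which group the item favors.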

References

  • Aaronson, D., Barrow, L. & Sander, W. (2007). Teachers and student achievement in the Chicago public high schools. Journal of Labor Economics, 25(1), 95-135.
  • Aksu, M. (2001). Student performance in dealing with fractions. The Journal of Educational Research, 90(6), 375-380.
  • Akyüz, G. (2006). Investigation of the effect of teacher and class characteristics on mathematics achievement in Turkey and European Union countries. Elementary Education Online, 5(2), 75-86.
  • Akyüz, G. (2014). The effects of student and school factors on mathematics achievement in TIMSS 2011. Eğitim ve Bilim, 39(172), 150-162.
  • Akyüz, G. & Berberoğlu, G. (2010). Teacher and classroom characteristics and their relations to mathematics achievement of the students in the TIMSS. New Horizons in Education, 58(1), 77-95.
  • Alacacı, C. & Erbaş, A. K. (2010). Unpacking the inequality among Turkish schools: Findings from PISA 2006. International Journal of Educational Development, 30(2), 182-192.
  • Atar, B. (2011). Application of descriptive and explanatory item response models to TIMSS 2007 Turkey mathematics data. Eğitim ve Bilim, 36(159), 255-269.
  • Awuor, R. A. (2008). Effect of unequal sample sizes on the power of DIF detection: An IRT-based Monte Carlo study with SIBTEST and Mantel-Haenszel procedures (Doctoral dissertation, Virginia Polytechnic Institute and State University).
  • Bakan-Kalaycıoğlu, D. & Kelecioğlu, H. (2011). Öğrenci Seçme Sınavı’nın madde yanlılığı açısından incelenmesi. Eğitim ve Bilim, 36(161).
  • Bilican, S., Demirtaşlı, R. N. & Kilmen, S. (2011). The attitudes and opinions of the students towards mathematics course: The comparison of TIMSS 1999 and TIMSS 2007. Educational Sciences: Theory and Practice, 11(3), 1277-1283.
  • Boyd, D., Grossman, P., Lankford, H., Loeb, S. & Wyckoff, J. (2005). How changes in entry requirements alter the teacher workforce and affect student achievement. (Working Paper No. 11844). Cambridge, MA: National Bureau of Economic Research.
  • Clotfelter, C. T., Ladd, H. F. & Vigdor, J. L. (2007). Teacher credentials and student achievement: Longitudinal analysis with student fixed effects. Economics of Education Review, 26(6), 673-682.
  • Clotfelter, C. T., Ladd, H. F. & Vigdor, J. L. (2010). Teacher credentials and student achievement in high school a cross-subject analysis with student fixed effects. Journal of Human Resources, 45(3), 655-681.
  • Demir, I., Kılıç, S. & Ünal, H. (2010). Effects of students’ and schools’ characteristics on mathematics achievement: Findings from PISA 2006. Procedia Social and Behavioral Sciences, 2(2010), 3099-3103.
  • Dinçer, M. A. & Kolasin, G. U. (2009). Türkiye’de öğrenci başarısında eşitsizliğin belirleyicileri. İstanbul: Sabancı Üniversitesi Eğitim Girişimi Reformu.
  • Doğan, N. & Barış, F. (2010). Tutum, değer ve özyeterlik değişkenlerinin TİMSS-1999 ve TİMSS-2007 sınavlarında öğrencilerin matematik başarılarını yordama düzeyleri. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, 1(1), 44-50.
  • Else-Quest, N. M., Hyde, J. S. & Linn, M. C. (2010). Cross-national patterns of gender differences in mathematics: A meta-analysis. Psychological Bulletin, 136(1), 101–127.
  • Embretson, S. E. & Reise, S. P. (2013). Item response theory. Psychology Press.
  • Engin-Demir, C. (2009). Factors influencing the academic achievement of the Turkish urban poor. International Journal of Educational Development, 29(2009), 17-29.
  • Erkan, S. S. S. (2013). A comparison of the education systems in Turkey and Singapore and 1999–2011 TIMSS tests results. Procedia-Social and Behavioral Sciences, 106 (2013), 55-64.
  • Friedman, L. (1996). Meta-analysis and quantitative gender differences: Reconciliation. Focus on Learning Problems in Mathematics, 18(3), 123-128.
  • Goldhaber, D. & Anthony, E. (2007). Can teacher quality be effectively assessed? National board certification as a signal of effective teaching. The Review of Economics and Statistics, 89(1), 134-150.
  • Grønmo, L. S., Lindquist, M., Arora, A. & Mullis, I. V. S. (2013). TIMSS 2015 mathematics framework. In Mullis, I. V. S. & Martin, M. O. (Eds.), TIMSS 2015 assessment frameworks (pp. 11-27). Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • Güner, N., Sezer, R. & İspir, O. A. (2013). İlköğretim ikinci kademe öğretmenlerinin TIMSS hakkındaki görüşleri. Pamukkale Üniversitesi Eğitim Fakültesi Dergisi, 33(1), 11-29.
  • Gürsakal, S. (2012). An evaluation of PISA 2009 student achievement levels’ affecting factors. Süleyman Demirel Üniversitesi İktisadi ve İdari Bilimler Dergisi, 17(1), 441-452.
  • Hair, J. F., Black, W. C., Babin, B. J. & Anderson, R. E. (2010). Multivariate data analysis (7th ed.). Pearson Prentice Hall.
  • Hambleton, R. K. & Jones, R. W. (1993). Comparison of classical test theory and item response theory and their applications to test development. Educational Measurement: Issues and Practice, 12(3), 38-47.
  • Hill, H. C., Rowan, B. & Ball, D. L. (2005). Effects of teachers’ mathematical knowledge for teaching on student achievement. American Educational Research Journal, 42(2), 371-406.
  • Hooper, D., Coughlan, J. & Mullen, M. R. (2008). Structural equation modelling: Guidelines for determining model fit. Electronic Journal of Business Research Methods, 6(1), 53-60.
  • İncikabı, L. (2012). After the reform in Turkey: A content analysis of SBS and TIMSS assessment in terms of mathematics content, cognitive domains, and item types. Education as Change, 16(2), 301-312.
  • Işıksal, M. & Aşkar, P. (2005). The effect of spreadsheet and dynamic geometry software on the achievement and self-efficacy of 7th-grade students. Educational Research, 47(3), 333-350.
  • Jacob, B. A. & Lefgren, L. (2002). The impact of teacher training on student achievement: Quasi-experimental evidence from school reform efforts in Chicago. (Working Paper No. 8916). Cambridge, MA: The National Bureau of Economic Research.
  • Jöreskog, K. G. & Sörbom, D. (1996). LISREL 8: User's reference guide. Scientific Software International.
  • Kan, A., Sünbül, Ö. & Ömür, S. (2013). 6.-8. sınıf seviye belirleme sınavları alt testlerinin çeşitli yöntemlere göre değişen madde fonksiyonlarının incelenmesi. Mersin Üniversitesi Eğitim Fakültesi Dergisi, 9(2), 207-222.
  • Karakaya, I. (2012). An investigation of item bias in science and technology subtests and mathematic subtests in level determination exam (LDE). Educational Sciences: Theory and Practice, 12(1), 222-229.
  • Kelecioğlu, H., Karabay, B. & Karabay, E. (2014). Investigation of placement test in terms of item biasness. Elementary Education Online, 13(3), 934-953.
  • Kılıç, S., Çene, E. & Demir, I. (2012). Comparison of learning strategies for mathematics achievement in Turkey with eight countries. Educational Sciences, 12(4), 2594-2598.
  • Lane, S., Wang, N. & Magone, M. (1996). Gender-related differential item functioning on a middle-school mathematics performance assessment. Educational Measurement: Issues and Practice, 15(4), 21-27.
  • Magis, D., Béland, S., Tuerlinckx, F. & De Boeck, P. (2010). A general framework and an R package for the detection of dichotomous differential item functioning. Behavior Research Methods, 42(3), 847-862.
  • Martin, M. O., Mullis, I. V. & Foy, P. (2008). TIMSS 2007 international mathematics report: Findings from IEA's Trends in International Mathematics and Science Study at the fourth and eighth grades. Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • Martin, M. O., Mullis, I. V. S. & Foy, P. (2013). TIMSS 2015 assessment design. In Mullis, I. V. S., Martin, M. O. (Eds.). TIMSS 2015 assessment frameworks (pp. 85-99). Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • Martin, M. O., Mullis, I. V. S. & Foy, P. (2017). TIMSS 2019 assessment design. In Mullis, I. V. S., & Martin, M. O. (Eds.). TIMSS 2019 assessment frameworks (pp. 81-91). Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • McGraw, R., Lubienski, S. T. & Strutchens, M. E. (2006). A closer look at gender in NAEP mathematics achievement and affect data: Intersections with achievement, race/ethnicity, and socioeconomic status. Journal for Research in Mathematics Education, 37(2), 129-150.
  • Mullis, I. V., Martin, M. O., Foy, P. & Arora, A. (2012). TIMSS 2011 International results in mathematics. International Association for the Evaluation of Educational Achievement. Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • Mullis, I. V. S., Martin, M. O., Foy, P. & Hooper, M. (2016). TIMSS 2015 International Results in Mathematics. Chestnut Hill, MA: TIMSS & PIRLS International Study Center at Boston College.
  • Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L. & Fishbein, B. (2020). TIMSS 2019 International Results in Mathematics and Science. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: https://timssandpirls.bc.edu/timss2019/international-results/
  • NCES. (2021). International Data Explorer. Retrieved from https://nces.ed.gov/timss/idetimss/
  • Nye, B., Konstantopoulos, S. & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26(3), 237-257.
  • Osterlind, S. J. & Everson, H. T. (2009). Differential item functioning. Sage Publications.
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36. https://www.jstatsoft.org/v48/i02/
  • Rowan, B., Chiang, F. S. & Miller, R. J. (1997). Using research on employees' performance to study the effects of teachers on students' achievement. Sociology of Education, 70(4), 256-84.
  • Ryan, K. E. & Chiu, S. (1996). Detecting DIF on mathematics items: The case for gender and calculator sensitivity. Paper presented at the annual meeting of the American Education Research Association, New York, NY.
  • Shealy, R. & Stout, W. (1993). A model-based standardization approach that separates true bias/DIF from group ability differences and detects test bias/DTF as well as item bias/DIF. Psychometrika, 58(2), 159-194.
  • Stout, W. & Roussos, L. (1996). DIF-pack SIBTEST program [Open source computer software].
  • Stronge, J. H., Ward, T. J. & Grant, L. W. (2011). What makes good teachers good? A cross-case analysis of the connection between teacher effectiveness and student achievement. Journal of Teacher Education, 62(4), 339-355.
  • Yıldırım, H. H., Yıldırım, S., Ceylan, E., Yetişir, M. İ. & Ajans, C. (2013). Türkiye Perspektifinden TIMSS 2011 Sonuçları. Ankara, Turkey: Pelin Ofset.
  • Zumbo, B. D. (1999). A Handbook on the Theory and Methods of Differential Item Functioning (DIF): Logistic Regression Modeling as a Unitary Framework for Binary and Likert-type (Ordinal) Item Scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.
There are 55 citations in total.

Details

Primary Language: Turkish
Subjects: Other Fields of Education
Journal Section: Research Articles
Authors: Musa Sadak (ORCID: 0000-0001-6036-1279)
Publication Date: March 26, 2022
Published in Issue: Year 2022, Volume: 23, Issue: Special Issue (Özel Sayı)

Cite

APA Sadak, M. (2022). TIMSS 2015 ve 2019 Matematik Sorularının Türkiye’de Cinsiyete Göre Madde Yanlılığının İncelenmesi: SIBTEST Prosedürü ile Değişen Madde Fonksiyonu Analizi. Ahi Evran Üniversitesi Kırşehir Eğitim Fakültesi Dergisi, 23(Özel Sayı), 211-258. https://doi.org/10.29299/kefad.961858
