Research Article
The Effect of the Item’s Type and Cognitive Level on Its Difficulty Index: The Sample of TIMSS 2015

Year 2020, Volume: 7 Issue: 2, 47 - 59, 01.08.2020
https://doi.org/10.17275/per.20.19.7.2

Abstract

In this research, the effect of an item's type and cognitive level on its difficulty index was investigated. The data source of the study was the responses of the 12,535 students in the Turkey sample of TIMSS 2015 (6,079 eighth-grade and 6,456 fourth-grade students). The responses covered a total of 215 mathematics items at the eighth-grade level (115 multiple-choice and 100 open-ended) and 178 at the fourth-grade level (93 multiple-choice and 85 open-ended). First, difficulty indices were calculated for the items of each type and cognitive level. Then, item type was coded as a dummy variable, and multiple regression analysis was applied to test the predictive effect of an item's type and cognitive level on its difficulty index. Both the type and the cognitive level of an item were found to have a statistically significant effect on its difficulty index. Students had more difficulty with open-ended items than with multiple-choice items, and the effect of item type on the difficulty index was larger for the eighth-grade items than for the fourth-grade ones. At both grade levels, the difficulty index moved toward zero as an item's cognitive level increased; however, the effect of cognitive level on the difficulty index was larger for the fourth-grade items than for the eighth-grade ones.
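
The analysis described in the abstract can be illustrated with a short sketch. The snippet below (Python with pandas and statsmodels, both of which are assumptions: the abstract does not name the software used) dummy codes item type and regresses the difficulty index on item type and cognitive level. The data, column names, and the 1-3 coding of the TIMSS cognitive domains (knowing, applying, reasoning) are hypothetical placeholders, not the study's actual values.

    # Minimal sketch with made-up data: dummy-coded multiple regression of
    # the difficulty index on item type and cognitive level.
    import pandas as pd
    import statsmodels.api as sm

    # One row per item: difficulty index (mean item score / maximum score),
    # item type, and cognitive level (1 = knowing, 2 = applying, 3 = reasoning).
    items = pd.DataFrame({
        "difficulty":      [0.62, 0.41, 0.55, 0.28, 0.73, 0.36, 0.48, 0.22],
        "item_type":       ["MC", "OE", "MC", "OE", "MC", "OE", "MC", "OE"],
        "cognitive_level": [1, 1, 2, 2, 1, 3, 3, 2],
    })

    # Dummy variable: 0 = multiple-choice (reference category), 1 = open-ended.
    items["open_ended"] = (items["item_type"] == "OE").astype(int)

    # Multiple regression of the difficulty index on the two predictors.
    X = sm.add_constant(items[["open_ended", "cognitive_level"]])
    model = sm.OLS(items["difficulty"], X).fit()
    print(model.summary())

Under this coding, a negative coefficient for the dummy variable means lower difficulty indices (harder items) for open-ended items, and a negative coefficient for cognitive level means the index moves toward zero as the cognitive level rises, which is the pattern the study reports.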

References

  • Büyüköztürk, Ş. (2010). Sosyal bilimler için veri analizi el kitabı [Handbook of data analysis for social sciences]. Ankara: Pegem.
  • Büyüköztürk, Ş., Çokluk, Ö., & Köklü, N. (2011). Sosyal bilimler için istatistik [Statistics for social sciences]. Ankara: Pegem Akademi.
  • Çıkrıkçı Demirtaşlı, N. (2010). Açık uçlu soru formatı ve öğrenci izleme sistemi (ÖİS) akademik gelişimi izleme ve değerlendirme (AGİD) modülündeki kullanımı [Open-ended question format and its use in the module of student monitoring system (SMS) for monitoring and evaluating academic development (MEAD)]. Cito Education Theory and Practice, 8, 22–30. Retrieved from http://www.ogretmen.info/makale/33.pdf
  • Delaram, M., & Safiri, A. (2014). A comparison of students’ scores in multiple-choice and essay questions in mother and child health examination. Future of Medical Education Journal, 4(2), 15–18. https://doi.org/10.22038/fmej.2014.2598
  • Delil, H. (2006). An analysis of geometry problems in 6 - 8 grades Turkish mathematics textbooks (Unpublished master's thesis). Middle East Technical University, Institute of Social Sciences, Secondary Science and Mathematics Education, Mathematics Education, Ankara.
  • Demir, E. (2010). Uluslararası öğrenci değerlendirme programı (PISA) bilişsel alan testlerinde yer alan soru tiplerine göre Türkiye’de öğrenci başarıları [The students’ achievement in Turkey, according to the question types used in program for international student assessment (PISA) cognitive domain tests] (Unpublished master's thesis). Hacettepe University, Institute of Social Sciences, Educational Sciences, Measurement and Evaluation, Ankara.
  • Gayef, A., Oner, C., & Telatar, B. (2014). Is asking same question in different ways has any impact on student achievement? Procedia - Social and Behavioral Sciences, 152, 339–342. https://doi.org/10.1016/j.sbspro.2014.09.206
  • Hastedt, D., & Sibberns, H. (2005). Differences between multiple choice items and constructed response items in the IEA TIMSS surveys. Studies in Educational Evaluation, 31, 145–161. https://doi.org/10.1016/j.stueduc.2005.05.007
  • İlhan, M. (2019). Madde analizi ve madde ile test istatistikleri arasındaki ilişki [Item analysis and relationship between item and test statistics]. In B. Çetin (Ed.), Eğitimde ölçme ve değerlendirme [Measurement and evaluation in education] (pp. 269–298). Ankara: Anı.
  • Kibble, J. D., & Johnson, T. (2011). Are faculty predictions or item taxonomies useful for estimating the outcome of multiple-choice examinations? Advances in Physiology Education, 35(4), 396–401. https://doi.org/10.1152/advan.00062.2011
  • Kim, M. K., Patel, R. A., Uchizono, J. A., & Beck, L. (2012). Incorporation of Bloom’s taxonomy into multiple-choice examination questions for a pharmacotherapeutics course. American Journal of Pharmaceutical Education, 76(6), 1–8. https://doi.org/10.5688/ajpe766114
  • Koçdar, S., Karadağ, N., & Şahin, M. D. (2016). Analysis of the difficulty and discrimination indices of multiple-choice questions according to cognitive levels in an open and distance learning context. The Turkish Online Journal of Educational Technology, 15(4), 16–24. Retrieved from https://files.eric.ed.gov/fulltext/EJ1117619.pdf
  • Milli Eğitim Bakanlığı [Ministry of National Education]. (2015). TIMSS 2015 ulusal matematik ve fen bilimleri ön raporu 4. ve 8. sınıflar [National mathematics and science preliminary report of grade 4 and 8 for TIMSS 2015]. Retrieved from http://timss.meb.gov.tr/wp-content/uploads/TIMSS_2015_Ulusal_Rapor.pdf
  • Momsen, J., Offerdahl, E., Kryjevskaia, M., Montplaisir, L., Anderson, E., & Grosz, N. (2013). Using assessments to investigate and compare the nature of learning in undergraduate science courses. CBE—Life Sciences Education, 12(2), 239–249. https://doi.org/10.1187/cbe.12-08-0130
  • Mullis, I. V. S., & Martin, M. O. (Eds.). (2013). TIMSS 2015 assessment frameworks. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
  • Nehm, R. H., & Schonfeld, I. S. (2011). Item feature effects in evolution assessment. Journal of Research in Science Teaching, 48(3), 237–256. https://doi.org/10.1002/tea.20400
  • Nevid, J. S., & McClelland, N. (2013). Using action verbs as learning outcomes: Applying Bloom’s taxonomy in measuring instructional objectives in introductory psychology. Journal of Education and Training Studies, 1(2), 19–24. https://doi.org/10.11114/jets.v1i2.94
  • O'Leary, M. (2001). Item format as a factor affecting the relative standing of countries in the Third International Mathematics and Science Study (TIMSS). Irish Educational Studies, 20(1), 153–177. https://doi.org/10.1080/0332331010200114
  • Öksüz, Y., & Güven Demir, E. (2019). Açık uçlu ve çoktan seçmeli başarı testlerinin psikometrik özellikleri ve öğrenci performansı açısından karşılaştırılması [Comparison of open ended questions and multiple choice tests in terms of psychometric features and student performance]. Hacettepe University Journal of Education, 34(1), 259–282. https://doi.org/10.16986/HUJE.2018040550
  • Özer Özkan, Y., & Özaslan, N. (2018). Student achievement in Turkey, according to question types used in PISA 2003-2012 mathematic literacy tests. International Journal of Evaluation and Research in Education (IJERE), 7(1), 57–64. Retrieved from https://files.eric.ed.gov/fulltext/EJ1174883.pdf
  • Pallant, J. (2005). SPSS survival manual: A step by step guide to data analysis using SPSS for Windows (Version 12). New York: Open University Press.
  • Pepple, D. J., Young, L. E., & Carroll, R. G. (2009). A comparison of student performance in multiple-choice and long essay questions in the MBBS stage I physiology examination at the University of the West Indies (Mona Campus). Advances in Physiology Education, 34(2), 86–89. https://doi.org/10.1152/advan.00087.2009
  • Tan, Ş. (2016). SPSS ve Excel uygulamalı temel istatistik–I [Basic statistics-I with SPSS and Excel]. Ankara: Pegem.
  • Temel, S., Dinçol Özgür, S., & Yılmaz, A. (2012). The effect of different types of test on preservice chemistry teachers’ achievement related to “chemical bonding”. Problems of Education in the 21st Century, 41, 123–129. Retrieved from http://oaji.net/articles/2014/457-1409840433.pdf
  • Temizkan, M., & Sallabaş, M. E. (2011). Okuduğunu anlama becerisinin değerlendirilmesinde çoktan seçmeli testlerle açık uçlu yazılı yoklamaların karşılaştırılması [Comparing multiple choice tests and open ended questions while evaluating reading comprehension skills]. Dumlupınar University Journal of Social Sciences, 30, 207–220. Retrieved from https://birimler.dpu.edu.tr/app/views/panel/ckfinder/userfiles/17/files/DERG_/30/207-220.pdf
  • Thawabieh, A. M. (2016). Comparison between two test item formats: Multiple-choice items and completion items. British Journal of Education, 4(8), 63–74. Retrieved from http://www.eajournals.org/wp-content/uploads/A-Comparison-between-Two-Test-Item-Formats-Multiple-Choice-Items-and-Completion-Items.pdf
  • Veeravagu, J., Muthusamy, C., Marimuthu, R., & Subrayan, A. (2010). Using Bloom’s taxonomy to gauge students’ reading comprehension performance. Canadian Social Science, 6(3), 205–212. https://doi.org/10.3968/j.css.1923669720100603.023
  • Yılmaz Koğar, E., & Koğar, H. (2019). Investigation of scientific literacy according to different item types: PISA 2015 Turkey sample. Bolu Abant İzzet Baysal University Journal of Faculty of Education, 19(2), 695–709. https://doi.org/10.17240/aibuefd.20xx.xx.xxxxx-xxxxxx

Details

Primary Language English
Subjects Field-Specific Education
Section Research Articles
Authors

Mustafa İlhan 0000-0003-1804-002X

Nagihan Boztunc Öztürk 0000-0002-2777-5311

Melek Gülşah Şahin 0000-0001-5139-9777

Publication Date August 1, 2020
Acceptance Date March 26, 2020
Published in Issue Year 2020, Volume: 7 Issue: 2

Cite

APA İlhan, M., Boztunc Öztürk, N., & Şahin, M. G. (2020). The Effect of the Item’s Type and Cognitive Level on Its Difficulty Index: The Sample of TIMSS 2015. Participatory Educational Research, 7(2), 47-59. https://doi.org/10.17275/per.20.19.7.2