Research Article

Impact of Answer-Switching Behavior on Multiple-Choice Test Scores in Higher Education

Year 2011, Volume: 2, Issue: 1, 114-120, 01.03.2011

Abstract

The multiple-choice format is one of the most popular selected-response item formats used in educational testing. Researchers have shown that multiple-choice tests are a useful vehicle for student assessment in core university subjects that usually have large enrollments. Even though educators, test experts, and various test-preparation resources maintain that the first answer should be retained, many researchers have argued that this advice is not supported by empirical findings. The main purpose of this study is to examine how answer-switching behavior affects multiple-choice test scores. Additionally, gender differences and the relationship between the number of answer switches and item parameters (item difficulty and item discrimination) were investigated. The participants were 207 upper-level College of Education students from mid-sized universities, and a midterm exam consisting of 20 multiple-choice questions was used. According to the results of this study, answer-switching behavior significantly increased test scores. On the other hand, there was no significant gender difference in answer-switching behavior. Additionally, there was a significant negative relationship between answer-switching behavior and item difficulty.
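
The item statistics and correlations described above can be illustrated with a short sketch. The following Python example is a minimal illustration using simulated data, not the study's actual responses: the answer key, the ~5% switch rate, and all variable names are hypothetical, and the paired t-test and point-biserial item discrimination are shown only as one plausible way to carry out this kind of analysis (cf. Varma, 2010, on point-biserial item statistics).

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students, n_items = 207, 20                      # sizes reported in the abstract

key = rng.integers(0, 4, size=n_items)             # hypothetical answer key (4 options)
initial = rng.integers(0, 4, size=(n_students, n_items))   # simulated first answers
switched = rng.random((n_students, n_items)) < 0.05        # ~5% of answers re-marked (hypothetical rate)
final = initial.copy()
final[switched] = rng.integers(0, 4, size=switched.sum())  # simulated final answers

score_initial = (initial == key).sum(axis=1)       # score if first answers had been kept
score_final = (final == key).sum(axis=1)           # score actually earned

# Does switching change total scores? (paired t-test as one plausible analysis)
t_stat, p_val = stats.ttest_rel(score_final, score_initial)
print(f"mean gain from switching: {(score_final - score_initial).mean():.2f} (p = {p_val:.3f})")

# Classical item statistics on the final responses.
correct = (final == key).astype(float)
item_difficulty = correct.mean(axis=0)             # proportion answering each item correctly
item_discrimination = np.array([                   # point-biserial correlation with total score
    stats.pointbiserialr(correct[:, j], score_final)[0] for j in range(n_items)
])

# Relationship between switching frequency and item parameters.
switches_per_item = switched.sum(axis=0)
r_diff, p_diff = stats.pearsonr(switches_per_item, item_difficulty)
r_disc, p_disc = stats.pearsonr(switches_per_item, item_discrimination)
print(f"switches vs. item difficulty:     r = {r_diff:+.2f} (p = {p_diff:.3f})")
print(f"switches vs. item discrimination: r = {r_disc:+.2f} (p = {p_disc:.3f})")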

Students in other faculties, for example the college of engineering or the college of arts and sciences, may show different behavior than students in the college of education. The second limitation is that relying on visible eraser marks results in an underestimate of answer switching. Milia (2007) pointed out that visible eraser marks are produced when students use greater effort to mark the item, and concluded that other cases of answer switching were not identified.

References
  • Benjamin, L. T., Cavell, T. A., & Shallenberger, W. R. (1984). Staying with the initial answers on objective tests: Is it a myth? Teaching of Psychology, 11, 133–141.
  • Carey, L. M. (1988). Measuring and evaluating school learning. Newton, MA: Allyn and Bacon.
  • Geiger, M. (1996). On the benefit of changing multiple-choice answers: Student perception and performance. Education, 117, 108–117.
  • Haney, W., & Madaus, G. (1989). Searching for alternatives to standardized tests: Whys, whats, and whithers. Phi Delta Kappan, 70, 683–687.
  • Kelley, T., Ebel, R., & Linacre, J. M. (2002). Item discrimination indices. Rasch Measurement Transactions, 16(3), 883–884.
  • Kruger, J., Wirtz, D., & Miller, D. T. (2005). Counterfactual thinking and the first instinct fallacy. Journal of Personality and Social Psychology, 88, 725–735.
  • Linn, R. L., & Gronlund, N. E. (1995). Measurement and assessment in teaching (7th ed.). Columbus, OH: Prentice-Hall.
  • Milia, L. D. (2007). Benefiting from multiple-choice exams: The positive impact of answer switching. Educational Psychology, 27(5), 607–615.
  • Nieswiadomy, R. N., Arnold, W. K., & Garza, C. (2001). Changing answers on multiple-choice examinations taken by baccalaureate nursing students. Journal of Nursing Education, 40, 142–144.
  • Oosterhof, A. (1994). Classroom applications of educational measurement (2nd ed.). New York, NY: Macmillan College Publishing Company.
  • Pressley, M., & Ghatala, E. S. (1988). Delusions about performance on multiple-choice comprehension test items. Reading Research Quarterly, 23, 454–464.
  • Shepard, L. A. (1989). Why we need better assessment. Educational Leadership, 46(7), 4–9.
  • Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about evaluation and assessment in higher education. Assessment and Evaluation in Higher Education, 30, 325–341.
  • Varma, S. (2010). Preliminary item statistics using point-biserial correlation and p values. Retrieved April 14, 2010, from http://www.eddata.com
  • Vispoel, W. (2000). Reviewing and changing answers on computerized fixed-item vocabulary tests. Educational and Psychological Measurement, 60, 371–384.
  • Wallace, M. A., & Williams, R. L. (2003). Multiple-choice exams: Explanations for student choices. Teaching of Psychology, 30, 136–138.
  • Zakay, D., & Glicksohn, J. (1992). Overconfidence in a multiple-choice test and its relationship to achievement. The Psychological Record, 42, 519–524.


Details

Primary Language: English
Section: Articles
Authors

Ramazan Baştürk

Publication Date: March 1, 2011
Published in Issue: Year 2011, Volume: 2, Issue: 1

How to Cite

APA Baştürk, R. (2011). Impact of Answer-Switching Behavior on Multiple-Choice Test Scores in Higher Education. Journal of Measurement and Evaluation in Education and Psychology, 2(1), 114-120.