Research Article

Scoring Methods for Multiple Choice Tests: How does the Item Difficulty Weighted Scoring Change Student’s Test Results?

Volume: 10 Number: 2 June 5, 2021

Abstract

The present study aimed to compare students' test scores and the item and test statistics calculated from unweighted (1-0) and item difficulty weighted (Q_j-0) scores. The study also included a proposal for converting the weighted scores into a 100-point scale. A teacher-made 34-item multiple-choice achievement test was administered to a group of 431 students. According to the data analysis, the McDonald's omega internal consistency coefficients obtained with the 1-0 and (Q_j-0) methods were .725 and .721, respectively. Between the student scores produced by the two methods, the Pearson product-moment correlation coefficient was .916 and the Spearman rank-order correlation coefficient was .926. Furthermore, a criterion-referenced evaluation was made using two cut-off scores (50 and 60), and the numbers of students who passed or failed the course were determined under both scoring methods. More students would be considered unsuccessful under the (Q_j-0) scoring method; however, this method was found to differentiate among individuals better than the unweighted 1-0 scoring method.
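The two scoring methods contrasted in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes the item weight Q_j is the proportion of examinees who answered item j incorrectly (so harder items carry more weight), and that the 100-point conversion simply rescales weighted scores by the maximum attainable weighted score.

```python
import numpy as np

def score_tests(responses):
    """Compare unweighted (1-0) and difficulty-weighted (Q_j-0) scoring.

    responses: (n_students, n_items) matrix of 1 (correct) / 0 (incorrect).
    Returns unweighted scores, weighted scores, and weighted scores
    rescaled to a 0-100 scale (assumed rescaling, see lead-in).
    """
    responses = np.asarray(responses, dtype=float)
    p = responses.mean(axis=0)     # item difficulty index p_j (proportion correct)
    q = 1.0 - p                    # assumed item weight Q_j (proportion incorrect)
    unweighted = responses.sum(axis=1)        # classic 1-0 total scores
    weighted = responses @ q                  # (Q_j-0) weighted total scores
    weighted_100 = 100.0 * weighted / q.sum() # rescale to a 100-point system
    return unweighted, weighted, weighted_100
```

Because correct answers to easy items contribute little under the weighted scheme, two students with identical 1-0 totals can receive different (Q_j-0) scores, which is consistent with the abstract's finding that weighted scoring differentiates among individuals more finely.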


Details

Primary Language

English

Subjects

Other Fields of Education

Journal Section

Research Article

Publication Date

June 5, 2021

Submission Date

February 11, 2021

Acceptance Date

May 2, 2021

Published in Issue

Year 2021 Volume: 10 Number: 2

APA
Yaşar, M., Kartal, S., & Aybek, E. C. (2021). Scoring Methods for Multiple Choice Tests: How does the Item Difficulty Weighted Scoring Change Student’s Test Results? Bartın University Journal of Faculty of Education, 10(2), 309-324. https://doi.org/10.14686/buefad.878504

All the articles published in the journal are open access and distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License.

