Research Article

Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation

Year 2024, Volume: 8 Issue: 2, 490 - 497, 31.05.2024
https://doi.org/10.30621/jbachs.1427464


Abstract

Introduction: Multiple choice questions (MCQs) are widely used in medical education. The Difficulty Index (DIF I), Discrimination Index (DI), and Distractor Efficiency (DE) are the most commonly used item-analysis indices. MCQ authors should anticipate the likely difficulty level of the questions they write and construct distractors carefully. This study aims to evaluate the quality of MCQs and the accuracy of MCQ authors' predictions of the difficulty levels of their questions.
Methods: The DIF I, DI, and DE values of 688 MCQs from the examinations held in the first year of the 2021-2022 academic year at Dokuz Eylül University Faculty of Medicine were investigated. The effect of the number of non-functional distractors (NFDs) on DIF I and DI, and the agreement between the question authors' DIF I predictions and the actual values, were evaluated.
Results: The mean DIF I was 0.57±0.21, and 47.5% of items were at the ideal difficulty level. There was a significant difference between the estimated and the actual DIF I values (p=0.002). The mean DI was 0.31±0.17, and 43.6% of items showed excellent discrimination. Of all distractors, 36.8% were non-functional. Both item difficulty and discriminating ability differed significantly according to the number of NFDs (p<0.001 for all).
Discussion: The number of NFDs significantly affected both difficulty and discriminating ability, and the authors' difficulty estimates differed from the actual values. Reducing the number of options in MCQs and taking greater care when crafting questions can improve question quality.
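The three indices reported in the abstract follow standard classical-test-theory definitions. As a minimal illustrative sketch (not code from the study; the upper/lower 27% group split and the <5% selection cutoff for non-functional distractors are common conventions assumed here), they can be computed from raw responses to a single five-option item:

```python
def item_indices(responses, scores, key, n_options=5):
    """Classical item analysis for one MCQ.

    responses: option chosen by each student (e.g. "A".."E")
    scores:    each student's total exam score (same order as responses)
    key:       the correct option
    """
    n = len(responses)

    # Difficulty Index (DIF I): proportion of all students answering correctly.
    dif = sum(1 for r in responses if r == key) / n

    # Discrimination Index (DI): difference in the correct-answer rate between
    # the upper and lower 27% of students ranked by total score.
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    g = max(1, round(0.27 * n))
    upper, lower = order[:g], order[-g:]
    di = (sum(responses[i] == key for i in upper)
          - sum(responses[i] == key for i in lower)) / g

    # Distractor Efficiency (DE): a distractor chosen by fewer than 5% of
    # students counts as non-functional (NFD); DE is the share of distractors
    # that remain functional.
    distractors = [o for o in "ABCDE"[:n_options] if o != key]
    nfd = sum(1 for d in distractors
              if sum(1 for r in responses if r == d) / n < 0.05)
    de = (len(distractors) - nfd) / len(distractors)
    return dif, di, nfd, de
```

For example, an item answered correctly by the 10 strongest of 20 students, with one option never chosen, yields DIF I = 0.50 (ideal difficulty), DI = 1.0 (excellent discrimination), one NFD, and DE = 0.75.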

Ethical Statement

The study was conducted after the approval of the Ethics Committee of Dokuz Eylül University for Non-interventional Studies was obtained (file no:7734-GOA, Decision No:2023/02-11).

Supporting Institution

None.

References

  • Carneson J, Delpierre G, Masters K. Designing, and managing multiple choice questions. 2nd ed. 2016; pp. 3–6. University of Cape Town. Available from: https://www.researchgate.net/publication/309263856_Designing_and_Managing_Multiple_Choice_Questions_2nd_Ed=channel=doi&linkId=58074fef08ae03256b783474&showFulltext=true .
  • Coderre SP, Harasym P, Mandin H, Fick G. The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts. BMC Med Educ. 2004;4:23.
  • Gierl MJ, Lai H, Turner SR. Using automatic item generation to create multiple-choice test items. Med Educ 2012;46(8):757-765.
  • Palmer EJ, Devitt PG. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper. BMC Med Educ 2007;7:49.
  • Tarrant M, Ware J. Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Med Educ 2008;42(2):198-206.
  • Tavakol M & Dennick R. Post-examination analysis of objective tests. Medical Teacher 2011;33(6):447–458.
  • Case SM, Swanson DB. Constructing written test questions for the basic and clinical sciences. 3rd ed. Philadelphia: National Board of Medical Examiners; 2000. Available from: https://www.researchgate.net/publication/242759434_Constructing_Written_Test_Questions_For_the_Basic_and_Clinical_Sciences
  • Downing SM. The effects of violating standard item writing principles on tests and students: the consequences of using flawed test items on achievement examinations in medical education. Adv Health Sci Educ Theory Pract. 2005;10(2):133-143.
  • Haladyna TM, Downing SM, Rodriguez MC. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education 2002;15(3):309-333.
  • Medical Council of Canada (MCC). Guidelines for the Development of Multiple-Choice Questions. Ottawa, ON: MCC; 2010. Available from: https://mcc.ca/media/Multiple-choice-question-guidelines.pdf
  • Paniagua MA, Swygert KA, editors. Constructing written test questions for the basic and clinical sciences. 2016. Available from: https://www.bumc.bu.edu/busm/files/2018/10/NBME-Constructing-Written-Test-Questions.pdf
  • Sutherland K, Schwartz J, Dickison P. Best Practices for Writing Test Items. Journal of Nursing Regulation 2012;3(2):35-39.
  • Christian DS, Prajapati AC, Rana BM, Dave VR. Evaluation of multiple choice questions using item analysis tool: a study from a medical institute of Ahmedabad, Gujarat. Int J Community Med Public Health 2017;4(6):1876-81.
  • Date AP, Borkar AS, Badwaik RT, Siddiqui RA, Shende TR, & Dashputra AV. Item analysis as tool to validate multiple choice question bank in pharmacology. International Journal of Basic & Clinical Pharmacology 2019;8(9):1999–2003.
  • Rehman A, Aslam A & Hassan SH. Item analysis of multiple choice questions. Pakistan Oral & Dental Journal 2018;38(2): 291-293.
  • Gierl MJ, Bulut O, Guo Q & Zhang X. Developing, Analyzing, and Using Distractors for Multiple-Choice Tests in Education: A Comprehensive Review. Review of Educational Research 2017;87(6):1082–1116.
  • Hassan S & Hod R. Use of item analysis to improve the quality of single best answer multiple choice question in summative assessment of undergraduate medical students in Malaysia. Education in Medicine Journal 2017;9(3):33-43.
  • Rao C, Kishan Prasad H L, Sajitha K, Permi H, Shetty J. Item analysis of multiple choice questions: Assessing an assessment tool in medical students. Int J Educ Psychol Res 2016;2:201-4.
  • Kolte V. Item analysis of Multiple Choice Questions in Physiology examination. Indian Journal of Basic and Applied Medical Research; 2015;4(4):320-326.
  • Uddin I, Uddin İ, Rehman IU, Siyar M, Mehbob U. Item Analysis of Multiple Choice Questions in Pharmacology. J Saidu Med Coll Swat 2020;10(2):128-13.
  • Bhattacherjee S, Mukherjee A, Bhandari K, Rout AJ. Evaluation of Multiple-Choice Questions by Item Analysis, from an Online Internal Assessment of 6th Semester Medical Students in a Rural Medical College, West Bengal. Indian J Community Med. 2022;47(1):92-95.
  • Kumar D, Jaipurkar R, Shekhar A, Sikri G, & Srinivas V. Item analysis of multiple choice questions: A quality assurance test for an assessment tool. Medical journal, Armed Forces India, 2021;77(1):85–89.
  • Masters JC, Hulsmeyer BS, Pike ME, Leichty K, Miller MT & Verst AL. Assessment of multiple-choice questions in selected test banks accompanying text books used in nursing education. The Journal of Nursing Education 2001;40(1):25–32.
  • Walsh K. Advice on writing multiple choice questions (MCQs). BMJ 2005;330: 25 - 27.
  • Costello E, Holland JC, Kirwan C. Evaluation of MCQs from MOOCs for common item writing flaws. BMC Res Notes. 2018;11(1):849.
  • Downing SM. Construct-irrelevant variance and flawed test questions: Do multiple-choice item-writing principles make any difference? Academic Medicine: Journal of the Association of American Medical Colleges 2002;77(10):103–104.
  • Gupta P, Meena P, Khan AM, Malhotra RK & Singh T. Effect of Faculty Training on Quality of Multiple-Choice Questions. International journal of applied & basic medical research 2020; 10(3):210–214.
  • Huang Yi-Min, Trevisan M, Storfer A. The Impact of the “all-of-the-above” Option and Student Ability on Multiple Choice Tests. International Journal for the Scholarship of Teaching and Learning 2007;1(2):11.
  • Scott KR, King AM, Estes MK, Conlon LW, Jones JS & Phillips AW. Evaluation of an Intervention to Improve Quality of Single-best Answer Multiple-choice Questions. The Western Journal of Emergency Medicine 2019;20(1):11–14.
  • Ali SH & Ruit KG. The Impact of item flaws, testing at low cognitive level, and low distractor functioning on multiple-choice question quality. Perspectives on medical education 2015; 4(5): 244–251.
  • Al-Lawama M & Kumwenda B. Decreasing the options' number in multiple choice questions in the assessment of senior medical students and its effect on exam psychometrics and distractors' function. BMC Medical Education 2023;23(1):212.
  • Belay LM, Sendekie TY & Eyowas FA. Quality of multiple-choice questions in medical internship qualification examination determined by item response theory at Debre Tabor University, Ethiopia. BMC Medical Education 2022;22(1):635.
  • Fozzard N, Pearson A, du Toit E, Naug H, Wen W & Peak IR. Analysis of MCQ and distractor use in a large first year Health Faculty Foundation Program: assessing the effects of changing from five to four options. BMC Medical Education 2018;18(1):252.
  • Pawade YR & Diwase DS. Can Item Analysis of MCQs Accomplish the Need of a Proper Assessment Strategy for Curriculum Improvement in Medical Education? i-manager’s Journal of Educational Technology 2016;13(1):44-53.
  • Rogausch A, Hofer R & Krebs R. Rarely selected distractors in high stakes medical multiple-choice examinations and their recognition by item authors: a simulation and survey. BMC medical education 2010;10:85.
  • Tarrant M, Ware J & Mohammed AM. An assessment of functioning and non-functioning distractors in multiple-choice questions: a descriptive analysis. BMC Medical Education, 2009; 9: 40.
  • Rahma A, Shamad M, Idris ME, Elfaki O, Elfakey W, Salih KM. Comparison in the quality of distractors in three and four options type of multiple-choice questions. Adv Med Educ Pract. 2017;8:287–91.
  • Kheyami D, Jaradat A, Al-Shibani T & Ali FA. Item Analysis of Multiple Choice Questions at the Department of Paediatrics, Arabian Gulf University, Manama, Bahrain. Sultan Qaboos University medical journal 2018;18(1):68–74.
  • Rodriguez MC. Three Options Are Optimal for Multiple-Choice Items: A Meta-Analysis of 80 Years of Research. Educational Measurement: Issues and Practice 2005; 24(2):3-13.
  • Hingorjo MR & Jaleel F. Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. JPMA. The Journal of the Pakistan Medical Association 2012;62(2): 142–147.
There are 40 citations in total.

Details

Primary Language English
Subjects Health Services and Systems (Other)
Journal Section Research Article
Authors

Serap Konakcı (ORCID: 0000-0002-3325-6382)

Publication Date May 31, 2024
Submission Date January 29, 2024
Acceptance Date May 10, 2024
Published in Issue Year 2024 Volume: 8 Issue: 2

Cite

APA Konakcı, S. (2024). Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation. Journal of Basic and Clinical Health Sciences, 8(2), 490-497. https://doi.org/10.30621/jbachs.1427464
AMA Konakcı S. Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation. JBACHS. May 2024;8(2):490-497. doi:10.30621/jbachs.1427464
Chicago Konakcı, Serap. “Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation”. Journal of Basic and Clinical Health Sciences 8, no. 2 (May 2024): 490-97. https://doi.org/10.30621/jbachs.1427464.
EndNote Konakcı S (May 1, 2024) Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation. Journal of Basic and Clinical Health Sciences 8 2 490–497.
IEEE S. Konakcı, “Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation”, JBACHS, vol. 8, no. 2, pp. 490–497, 2024, doi: 10.30621/jbachs.1427464.
ISNAD Konakcı, Serap. “Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation”. Journal of Basic and Clinical Health Sciences 8/2 (May 2024), 490-497. https://doi.org/10.30621/jbachs.1427464.
JAMA Konakcı S. Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation. JBACHS. 2024;8:490–497.
MLA Konakcı, Serap. “Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation”. Journal of Basic and Clinical Health Sciences, vol. 8, no. 2, 2024, pp. 490-7, doi:10.30621/jbachs.1427464.
Vancouver Konakcı S. Item Analysis in Multiple Choice Questions: A Study on Question Difficulty and Authors’ Evaluation. JBACHS. 2024;8(2):490-7.