Review

Suggestions for Designing Multiple-Choice Items for Higher-Order Thinking Skills, with Examples from the Healthcare Field

Year 2022, Volume: 25, Issue: 2, 117-125, 30.06.2022
https://doi.org/10.5152/JANHS.2022.875827

Abstract

The aim of this study is to highlight the points that should be considered when writing multiple-choice items capable of measuring higher-order thinking skills and, contrary to a common misconception, to draw attention to the fact that higher-order thinking skills can indeed be measured with multiple-choice items. It is an indisputable fact that multiple-choice items occupy a large place in many parts of education, from classroom assessment to large-scale testing, so structuring them to address higher-order skills is very important. Despite the widespread use of such items, however, writing high-quality multiple-choice items that can measure higher-order skills remains difficult for teachers, educators, and test developers. The study presents the techniques proposed in the literature for writing multiple-choice items that can measure higher-order skills and offers a set of practical suggestions. It is concluded that, with these techniques, multiple-choice items can measure higher-order thinking skills. It is thought that this study, which introduces and compiles these techniques, can benefit both students and test developers.

References

  • 1. Kelly FJ. The Kansas silent reading tests. J Educ Psychol. 1916;7(2):63-80. [CrossRef]
  • 2. Lukhele R, Thissen D, Wainer H. On the relative value of multiple-choice, constructed response, and examinee-selected items on two achievement tests. J Educ Meas. 1994;31(3):234-250. [CrossRef]
  • 3. Scully D. Constructing multiple-choice items to measure higher-order thinking. Pract Assess Res Eval. 2017;22(4):1-13.
  • 4. Burton SJ, Sudweeks RR, Merrill PF, Wood B. How to Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty. Utah: Brigham Young University Testing Services and the Department of Instructional Science; 1991;1-33.
  • 5. Lemons PP, Lemons JD. Questions for assessing higher-order cognitive skills: it’s not just Bloom’s. Sci Educ. 2013;12(1):47-58. [CrossRef]
  • 6. Istiyono E. The development of physics essay test for higher order thinking skills in junior high school. Proceedings of the International Conference on Research, Implementation and Education of Mathematics and Sciences; 2014 May 18-20; Yogyakarta State University.
  • 7. Newmann FM. Higher order thinking in the teaching of social studies: Connections between theory and practice. In: Voss J, Perkins D, Segal J, eds. Informal Reasoning and Education. Hillsdale, NJ: Lawrence Erlbaum; 1991:381-400.
  • 8. Daly WM. Critical thinking as an outcome of nursing education. What is it? Why is it important to nursing practice? J Adv Nurs. 1998;28(2):323-331. [CrossRef]
  • 9. Schroeder JM. A Study of Improving Critical Thinking Skills with Multiple Choice Tests and First Semester Associate Degree Nursing Students. Doctoral dissertation. Minneapolis, MN: Capella University; 2007.
  • 10. Dickinson M. Writing multiple-choice questions for higher-level thinking. Learning Solutions Magazine. Available at: https://campbellms.typepad.com/files/writing-multiple-choice-questions-for-higher-level-thinking-by-mike-dickinson-_-learning-solutions-magazine.pdf. Accessed March 2, 2020.
  • 11. Forehand M. Bloom’s Taxonomy: Original and Revised. In: Orey M, ed. Emerging Perspectives on Learning, Teaching, and Technology (e-book); 2005. Available at: https://textbookequity.org/Textbooks/Orey_Emergin_Perspectives_Learning.pdf; http://epltt.coe.uga.edu/index.php?title=Bloom%27s_Taxonomy. Accessed January 20, 2011.
  • 12. Bloom B, Englehart M, Furst E, Hill W, Krathwohl D. A Taxonomy of Educational Objectives, Handbook I: Cognitive Domain. New York: David McKay Company; 1956.
  • 13. Bloom B. Reflections on the development and use of the taxonomy. In: Anderson LW, Sosniak LA, eds. Bloom’s Taxonomy: A Forty-Year Retrospective. Chicago: The University of Chicago Press; 1994:1-8.
  • 14. Anderson LW, Krathwohl DR, Airasian PW, et al. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. complete ed. New York: Longman; 2001.
  • 15. Wilson LO. Anderson and Krathwohl–Bloom’s taxonomy revised. Understanding the new version of Bloom’s taxonomy. Available at: https://quincycollege.edu/content/uploads/Anderson-and-Krathwohl_Revised-Blooms-Taxonomy.pdf. Accessed March 11, 2020.
  • 16. Wiggins G. Five unfortunate misunderstandings that almost all educators have about Bloom’s Taxonomy [blog post]. Available at: https://grantwiggins.wordpress.com/2015/03/04/5-unfortunate-misunderstandingsthat-almost-all-educators-have-about-blooms-taxonomy/. Accessed March 11, 2020.
  • 17. Jovanovska J. Designing effective multiple-choice questions for assessing learning outcomes. Infotheca, J Digit Humanit. 2018;18(1):25-42.
  • 18. Morrison S, Free KW. Writing multiple-choice test items that promote and measure critical thinking. J Nurs Educ. 2001;40(1):17-24. [CrossRef]
  • 19. Crowe A, Dirks C, Wenderoth MP. Biology in bloom: Implementing Bloom’s taxonomy to enhance student learning in biology. CBE Life Sci Educ. 2008;7(4):368-381. [CrossRef]
  • 20. Brookhart SM. How to Assess Higher-Order Thinking Skills in Your Classroom. Alexandria: ASCD; 2010.
  • 21. Clay B. A Short Guide to Writing Effective Test Questions. Lawrence: Kansas Curriculum Center, University of Kansas; 2001.
  • 22. Azer SA. Assessment in a problem-based learning course: Twelve tips for constructing multiple choice questions that test students’ cognitive skills. Biochem Mol Biol Educ. 2003;31(6):428-434. [CrossRef]
  • 23. Burns ER. “Anatomizing” reversed: Use of examination questions that foster use of higher order learning skills by students. Anat Sci Educ. 2010;3(6):330-334. [CrossRef]
  • 24. Haladyna TM, Downing SM, Rodriguez MC. A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ. 2002;15(3):309-333. [CrossRef]
  • 25. Collins J. Education techniques for lifelong learning: Writing multiple-choice questions for continuing medical education activities and self-assessment modules. RadioGraphics. 2006;26(2):543-551. [CrossRef]
  • 26. Zaidi NB, Hwang C, Scott S, Stallard S, Purkiss J, Hortsch M. Climbing Bloom’s taxonomy pyramid: Lessons from a graduate histology course. Anat Sci Educ. 2017;10(5):456-464. [CrossRef]
  • 27. Abdalla ME, Gaffar AM, Suliman RA. Constructing A-type Multiple Choice Questions (MCQs): Step by Step Manual. Blueprints in Health Profession Education Series. 2011. Retrieved from https://www.researchgate.net/publication/235914088
  • 28. Haladyna TM. Developing and Validating Multiple-Choice Test Items. Hillsdale, NJ: Lawrence Erlbaum Associates; 1994.
  • 29. Ascalon ME, Meyers LS, Davis BW, Smits N. Distractor similarity and item-stem structure: effects on item difficulty. Appl Meas Educ. 2007;20(2):153-170. [CrossRef]
  • 30. DiBattista D, Kurzawa L. Examination of the quality of multiple-choice items on classroom tests. cjsotl-rcacea. 2011;2(2):1-23. [CrossRef]
  • 31. Hancock GR. Cognitive complexity and the comparability of multiple-choice and constructed-response test formats. J Exp Educ. 1994;62(2):143-157. [CrossRef]
  • 32. Ericsson KA, Simon H. Protocol Analysis: Verbal Reports as Data. rev. ed. Cambridge, MA: MIT Press; 1993.
  • 33. Çelik H, Kocabıyık E, Sönmezer Ü. Kocaeli İl Milli Eğitim Müdürlüğü Ölçme ve Değerlendirme Merkezi tarafından hazırlanan 6. ve 7. Sınıf matematik dersi ortak yazılı sınav sorularının madde türlerine ve yenilenmiş Bloom taksonomisine göre değerlendirilmesi. Ege Bilimsel Araştırmalar Derg. 2020;3(1):28-53.
  • 34. Gökulu A. Fen ve teknoloji öğretmenlerinin yazılı sınav soruları ile TEOG sınavlarında sorulan fen ve teknoloji sorularının yenilenmiş Bloom taksonomisine göre incelenmesi. Route Educational and Social Science Journal. 2015;2(2):434-446.
  • 35. Koç H, Sönmez ÖF, Çiftçi T. YGS ve LYS sınavlarındaki coğrafya sorularının Bloom taksonomisi bilişsel alan düzeyi açısından analizi. J Black Sea Stud. 2013;9(36):257-275.
  • 36. Özmen H. 1990-2005 ÖSS sınavlarındaki kimya sorularının konu alanlarına ve Bloom taksonomisine göre incelenmesi. Eurasian J Educ Res (EJER). 2005;21:187-199.
  • 37. Badat T, Usgu G, Dinler E, Bayramlar K, Yakut Y. Çoktan seçmeli sınavlarda kullanılan ölçme ve değerlendirme sisteminin uygulanması: Madde analiz örneği. Hacettepe Univ Fac Health Sci J. 2020;7(3):285-295.
  • 38. Cansever Z, Acemoğlu H, Avşar Ü, Hoşoğlu S. Tıp fakültesindeki çoktan seçmeli sınav sorularının değerlendirilmesi. Tıp Eğitimi Dünyası. 2016;14(44):44-55. [CrossRef]
  • 39. Khorshid L, Zaybak A, Güneş ÜY, et al. Hemşirelik yüksekokulu çoktan seçmeli sınav sorularının analizi. Ege Univ Hemşirelik Fak Derg. 2010;26(2):37-48.
  • 40. Mahjabeen W, Alam S, Hassan U, et al. Difficulty index, discrimination index and distractor efficiency in multiple choice questions. Annals PIMS-Shaheed Zulfiqar Ali Bhutto Medical University. 2018;13(4):310-315.
  • 41. Özcan KV, Aydoğan Y, Bulut İ. Gaziosmanpaşa Üniversitesi Tıp Fakültesinde uygulanan çoktan seçmeli sınavların betimsel analizi. Gaziosmanpaşa Univ Tıp Fak Derg. 2014;6(4):281-294.
  • 42. Topal K, Aybek H, Kara CO, Büke A, Aybek Z. PAÜ Tıp Fakültesi dönem I öğrencilerine 2006-2007 eğitim ve öğretim yılında uygulanan çoktan seçmeli sınavların madde ve test analizleri. Pamukkale Tıp Derg. 2008;1(3):120-126.
  • 43. Tiemeier AM, Stacy ZA, Burke JM. Using multiple choice questions written at various Bloom’s taxonomy levels to evaluate student performance across a therapeutics sequence. Innov Pharm. 2011;2(2):1-11. [CrossRef]
  • 44. Simkin MG, Kuechler WL. Multiple-choice tests and student understanding: What is the connection? Decis Sci J Innov Educ. 2005;3(1):73-98. [CrossRef]

Details

Primary Language: Turkish
Subjects: Health Care Administration
Section: Reviews
Authors

Esin Yılmaz Koğar 0000-0001-6755-9018

Publication Date: June 30, 2022
Published in Issue: Year 2022, Volume: 25, Issue: 2

How to Cite

AMA: Yılmaz Koğar E. Sağlık Alanından Örneklerle Çoktan Seçmeli Maddelerin Üst Düzey Düşünme Becerilerine Yönelik Tasarlanması için Öneriler. Journal of Nursology. June 2022;25(2):117-125. doi:10.5152/JANHS.2022.875827
