Research Article

ESTABLISHING A MEASUREMENT AND EVALUATION LABORATORY IN A NURSING FACULTY: A DESCRIPTIVE STUDY

Year 2021, Volume: 4 Issue: 2, 169 - 182, 30.09.2021
https://doi.org/10.52538/iduhes.949794

Abstract

This study was conducted to establish a measurement and evaluation laboratory in a nursing faculty and to evaluate the quality of the tests administered as part of the education curriculum. The data of this descriptive cross-sectional study consisted of 42 tests from vocational nursing courses, comprising 2,671 test items. The research was carried out in the nursing faculty of a state university between June 26, 2014 and June 28, 2018. In the first stage, the measurement and evaluation laboratory was established. In the second stage, trainer training was conducted. In the third stage, the item difficulty and item discrimination indices of the vocational nursing tests were determined. Data were analyzed with the Statistical Package for the Social Sciences (SPSS) 21.0; frequencies, percentage distributions, and means were calculated. Of all the test items in vocational nursing courses, 40.49% in the 2015-2016 academic year and 38.90% in the 2016-2017 academic year had item difficulty indices in the very easy range (0.81-1.00), and 46.83% and 43.56%, respectively, had very low item discrimination indices (0.19 and below). In conclusion, most of the tests administered in nursing courses were easy, and the discrimination of all tests was very low. Further research is recommended on the item analysis of multiple-choice questions measuring professional nursing knowledge and on identifying item-writing errors.
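For readers unfamiliar with the indices named above, the following is a minimal, illustrative Python sketch of how classical item difficulty (P) and item discrimination (D) indices are typically computed. The study's own scoring data and grouping convention are not reported here, so the upper/lower 27% split and all numbers in the example are assumptions, not the authors' data.

# Minimal illustrative sketch (hypothetical data, not the study's dataset):
# classical item difficulty (P) and discrimination (D) indices for one
# multiple-choice item, assuming 0/1 scoring and the common upper/lower
# 27% grouping by total test score.

def item_difficulty(item_scores):
    # P = R / N: proportion of examinees answering the item correctly.
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, total_scores, group_ratio=0.27):
    # D = (R_U - R_L) / n: correct answers in the upper group minus the
    # lower group, divided by the group size n.
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i], reverse=True)
    n = max(1, int(len(order) * group_ratio))
    upper, lower = order[:n], order[-n:]
    return (sum(item_scores[i] for i in upper) - sum(item_scores[i] for i in lower)) / n

# Hypothetical responses of 10 examinees to one item (1 = correct) and their
# total test scores:
item = [1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
totals = [38, 35, 34, 20, 31, 30, 18, 29, 27, 26]
print(item_difficulty(item))               # 0.9 -> "very easy" band (0.81-1.00)
print(item_discrimination(item, totals))   # 0.5; values of 0.19 or below count as very low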

Supporting Institution

Ege Üniversitesi Bilimsel Araştırma Projeleri

Project Number

14-HYO-001

References

  • Abdulghani, H. M., Ahmad, F., Irshad, M., Khalil, M. S., Al-Shaikh, G. K., Syed, S., … Haque, S. (2015). Faculty development programs improve the quality of Multiple Choice Questions items’ writing. Scientific Reports, 5(9556), 1–6. https://doi.org/10.1038/srep09556
  • AlFaris, E., Naeem, N., Irfan, F., Qureshi, R., Saad, H., Al Sadhan, R., … Van der Vleuten, C. (2015). A One-Day Dental Faculty Workshop in Writing Multiple-Choice Questions: An Impact Evaluation. Journal of Dental Education, 79(11), 1305–1313.
  • Atılgan, H. (2011). Eğitimde Ölçme ve Değerlendirme (4. Baskı). Ankara: Anı yayıncılık.
  • Brady, A. M. (2005). Assessment of learning with multiple-choice questions. Nurse Education in Practice, 5(4), 238–242. https://doi.org/10.1016/j.nepr.2004.12.005
  • Case, S. M., & Swanson, D. B. (2001). Constructing Written Test Questions For the Basic and Clinical Sciences. National Board of Medical Examiners, (June), 1–181.
  • Clifton, S. L., & Schriner, C. L. (2010). Assessing the quality of multiple-choice test items. Nurse Educator, 35(1), 12–16. https://doi.org/10.1097/NNE.0b013e3181c41fa3
  • D’Sa, J. L., & Visbal-Dionaldo, M. L. (2017). Analysis of Multiple Choice Questions: Item Difficulty, Discrimination Index and Distractor Efficiency. International Journal of Nursing Education, 9(3), 109. https://doi.org/10.5958/0974-9357.2017.00079.4
  • Dascalu, C. G., Enache, A. M., Mavru, R. B., & Zegan, G. (2015). Computer-based MCQ Assessment for Students in Dental Medicine–Advantages and Drawbacks. Procedia - Social and Behavioral Sciences, 187, 22–27. https://doi.org/10.1016/j.sbspro.2015.03.005
  • Dell, K. A., & Wantuch, G. A. (2017). How-to-guide for writing multiple choice questions for the pharmacy instructor. Currents in Pharmacy Teaching and Learning, 9(1), 137–144. https://doi.org/10.1016/j.cptl.2016.08.036
  • Dönder, A., Elaldı, Ş., & Özkaya, Ö. M. (2012). Views of Lecturers on Complementary Methods of Measurement And Evaluation That Can be Used In Basic English Teaching at University Level. Turkish Studies, 7(1), 953–968.
  • Downing, S. M. (2005). The effects of violating standard item writing principles on tests and students: The consequences of using flawed test items on achievement examinations in medical education. Advances in Health Sciences Education, 10(2), 133–143. https://doi.org/10.1007/s10459-004-4019-5
  • Erkuş, A. (2006). Sınıf öğretmenleri için ölçme ve değerlendirme: kavramlar ve uygulamalar. Ankara: Ekinoks Yayınları.
  • Gelbal, S., & Kelecioglu, H. (2007). Teachers’ Proficiency Perceptions about The Measurement and Evaluation Techniques and The Problems They Confront. Hacettepe University Journal of Education, 33, 135–145.
  • Gupta, P., Meena, P., Khan, A., Malhotra, R., & Singh, T. (2020). Effect of Faculty Training on Quality of Multiple‑Choice Questions. International Journal of Applied and Basic Medical Research, 10, 210–214. https://doi.org/10.4103/ijabmr.IJABMR
  • Hicks, N. A. (2011). Guidelines for identifying and revising culturally biased multiple-choice nursing examination items. Nurse Educator, 36(6), 266–270. https://doi.org/10.1097/NNE.0b013e3182333fd2
  • Hingorjo, M. R., & Jaleel, F. (2012). Analysis of one-best MCQs: The difficulty index, discrimination index and distractor efficiency. Journal of the Pakistan Medical Association, 62(2), 142–147.
  • Linn, R., & Miller, M. (2005). Measurement and Assessment in Teaching (9th ed.). Upper Saddle River, NJ: Pearson Education.
  • Macerata, I. M., Costa, L. A., & Lages e Silva, R. L. (2018). Brutalists Walytopias: Operations for a savage clinic. Arquivos Brasileiros de Psicologia, 70(1), 146–160. https://doi.org/10.1016/j.jtumed.2016.08.014
  • Mahjabeen, W., Alam, S., Hassan, U., Zafar, T., Butt, R., Konain, S., & Rizvi, M. (2018). Difficulty Index, Discrimination Index and Distractor Efficiency in Multiple Choice Questions. Annals of PIMS, 12(March).
  • Mehmood, B., Sultan, S., Azhar, M., Shakoor, A., Khan, N. B., & Niazi, Z. (2021). Quality analysis of multiple choice questions. Community Dentistry, 40(4), 236–239.
  • Morrison, S., & Free, K. W. (2001). Writing multiple-choice test items that promote and measure critical thinking. The Journal of Nursing Education, 40(1), 17–24.
  • Nartgün, Z., Çetiner, B., Gömleksiz, M., Bıçak, B., Karaca, E., & Yurdabakan, İ. (2016). Eğitimde Ölçme ve Değerlendirme (4. Baskı; S. Erkan, ed.). Ankara: Nobel Akademik Yayıncılık.
  • Nedeau-Cayo, R., Laughlin, D., Rus, L., & Hall, J. (2013). Assessment of item-writing flaws in multiple-choice questions. Journal for Nurses in Professional Development, 29(2), 52–57. https://doi.org/10.1097/NND.0b013e318286c2f1
  • Nemec, E. C., & Welch, B. (2016). The impact of a faculty development seminar on the quality of multiple-choice questions. Currents in Pharmacy Teaching and Learning, 8(2), 160–163. https://doi.org/10.1016/j.cptl.2015.12.008
  • Przymuszała, P., Piotrowska, K., Lipski, D., Marciniak, R., & Cerbin-Koczorowska, M. (2020). Guidelines on Writing Multiple Choice Questions: A Well-Received and Effective Faculty Development Intervention. SAGE Open, 10(3), 1–12. https://doi.org/10.1177/2158244020947432
  • Quinn, F. (2000). The Principles and Practice of Nurse Education (4th ed.). Cheltenham: Stanley Thornes (Publishers) Ltd.
  • Race, P., & Brown, S. (2001). The Lecturer’s Toolkit (2nd ed.). London: Kogan Page.
  • Rush, B. R., Rankin, D. C., & White, B. J. (2016). The impact of item-writing flaws and item complexity on examination item difficulty and discrimination value. BMC Medical Education, 16(250), 1–10. https://doi.org/10.1186/s12909-016-0773-3
  • Sahin, H., Atay, Y., Yagdi, T., & Aka, S. A. (2017). Item analyses of 2016 written exam of Turkish Society of Cardiovascular Surgery Proficiency Board. Turkish Journal of Thoracic and Cardiovascular Surgery, 25(4), 600–607. https://doi.org/10.5606/tgkdc.dergisi.2017.14513
  • Semerci, Ç. (2007). A View to the New Primary School Curricula With the Metaphors Relating to “Curriculum Development.” Cumhuriyet University Faculty of Literature Journal of Social Sciences, 31(2), 125–140.
  • Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education in Practice, 6(6), 354–363. https://doi.org/10.1016/j.nepr.2006.07.002
  • Tarrant, M., & Ware, J. (2012). A Framework for improving the quality of multiple-choice assessments. Nurse Educator, 37(3), 98–104. https://doi.org/10.1097/NNE.0b013e31825041d0
  • Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Medical Education, 9(1), 1–8. https://doi.org/10.1186/1472-6920-9-40
  • Topal, K., Aybek, H., Kara Orhan, C., Büke, A., & Aybek, Z. (2008). Item and Test Analysis of Multiple-Choice Exams Which Applied to The First Term Students Of Pamukkale University Medical Faculty In 2006-2007 Academic Year. Pamukkale Medical Journal, 1(3), 120–126.
  • Yeşilyurt, E. (2012). Öğretmen Adaylarının Ölçme Ve Değerlendirme Alanına Ilişkin Genel Yeterlik Algıları. Mustafa Kemal Üniversitesi Sosyal Bilimler Enstitüsü Dergisi, 9(17), 377–395.

There are 35 citations in total.

Details

Primary Language English
Subjects Nursing
Journal Section Articles
Authors

Fatma Orgun 0000-0002-2351-7227

Fahriye Vatan 0000-0002-8044-0908

Nilay Özkütük 0000-0003-1405-4600

Hale Sezer 0000-0003-4199-7727

Meltem Dursun Engin 0000-0002-7284-040X

Project Number 14-HYO-001
Publication Date September 30, 2021
Submission Date June 8, 2021
Published in Issue Year 2021 Volume: 4 Issue: 2

Cite

APA Orgun, F., Vatan, F., Özkütük, N., Sezer, H., et al. (2021). ESTABLISHING A MEASUREMENT AND EVALUATION LABORATORY IN A NURSING FACULTY: A DESCRIPTIVE STUDY. Izmir Democracy University Health Sciences Journal, 4(2), 169-182. https://doi.org/10.52538/iduhes.949794
