Review

Principles for Minimizing Errors in Examination Papers and Other Educational Assessment Instruments

Year 2021, Volume: 8, Issue: 2, 310-325, 10.06.2021
https://doi.org/10.21449/ijate.897874

Abstract

Errors in examination papers and other assessment instruments can compromise fairness. For example, a history question containing an incorrect historical date could be impossible for students to answer. Incorrect instructions at the start of an examination could lead students to answer the wrong number of questions. As there is little research on this issue within the educational assessment community, we reviewed the literature on minimizing errors in other industries and domains, including aviation, energy, and medicine. We identified generalizable principles and applied them to our context of educational assessment. We argue that since assessment instrument construction is a complex system comprising numerous interacting components, a holistic approach to system improvement is required. Assessment instrument errors stem primarily from human failure. When human failure occurs, it is not good enough to suggest that ‘to err is simply human’. Instead, it is necessary to look deeper, evaluating the latent working conditions that underpin the efficacy of procedures and make human failure more or less likely. Drawing on the aviation industry’s ergonomic SHELLO model, we articulate and explore three of the most critical working conditions that relate to our context: (i) time pressure, (ii) workload and stress, and (iii) wider organizational culture, including good error data collection. We conclude with recommendations for best practice in minimizing errors in assessment instruments. A ‘good’ error culture should be promoted, which avoids blaming individuals. Errors should be acknowledged readily by all, and system owners should take a scientific approach to understanding and learning from them.

References

  • Achtenhagen, F. (1994, June). Presentation to Third International Conference of Learning at Work, Milan, Italy.
  • Aimola Davies, A., Waterman, S., White, R., & Davies, M. (2013). When you fail to see what you were told to look for: Inattentional blindness and task instructions. Consciousness and Cognition, 22(1), 221-230. https://doi.org/10.1016/j.concog.2012.11.015
  • Akinci, B. (2014). Situational Awareness in Construction and Facility Management. Frontiers of Engineering Management, 1(3), 283-289. https://doi.org/10.15302/J-FEM-2014037
  • Baddeley, A. (2010). Working memory. Current Biology, 20(4), R136-R140.
  • Baranowski, R. (2006). Item editing and editorial review. In S. Downing & T. Haladyna (Eds.), Handbook of Test Development (pp. 349-357). Lawrence Erlbaum Associates.
  • Battmann, W., & Klumb, P. (1993). Behavioural economics and compliance with safety regulations. Safety Science, 16, 35-46. https://www.sciencedirect.com/science/article/pii/092575359390005X
  • BBC (2017, November 21). NZ minister orders probe into 'impossible' maths exam. https://www.bbc.co.uk/news/blogs-news-from-elsewhere-42065574
  • Bleetman, A., Sanusi, S., Dale, T., & Brace, S. (2012). Human factors and error prevention in emergency medicine. Emergency Medicine Journal, 29, 389-393. http://emj.bmj.com/content/29/5/389.long
  • Borchard, E. M. (2013). Convicting the innocent and state indemnity for errors of criminal justice. The Justice Institute. (Original work published 1932)
  • Bruner, J. S., & Postman, L. (1949). On the perception of incongruity: a paradigm. Journal of Personality, 18(2), 206-223. https://doi.org/10.1111/j.1467-6494.1949.tb01241.x
  • Chang, Y.-H., & Wang, Y.-C. (2010). Significant human risk factors in aircraft maintenance technicians. Safety Science, 48(1), 54-62. https://doi.org/10.1016/j.ssci.2009.05.004
  • Chang, Y.-H., Yang, H.-H., & Hsiao, Y.-J. (2016). Human risk factors associated with pilots in runway excursions. Accident Analysis & Prevention 94, 227-237. https://doi.org/10.1016/j.aap.2016.06.007
  • Civil Aviation Authority (2014). Flight-crew human factors handbook. CAA. http://publicapps.caa.co.uk/docs/33/CAP%20737%20DEC16.pdf
  • Dekker, S. (2002a). The Field Guide to Human Error Investigations. Ashgate.
  • Dekker, S. (2002b). Reconstructing human contributions to accidents: the new view on error and performance. Journal of Safety Research, 33(3), 371-385. https://doi.org/10.1016/S0022-4375(02)00032-4
  • Dekker, S. (2017). Just Culture: Restoring trust and accountability in your organization. CRC Press.
  • Edmondson, A. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350-383.
  • Edwards, E. (1972). Man and machine: systems for safety. Proceedings of the British Airline Pilots Association Technical Symposium, London.
  • Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), 32-64.
  • Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.
  • Flin, R., O’Connor, P., & Mearns, K. (2002). Crew resource management: improving team work in high reliability industries. Team Performance Management: An International Journal, 8(3/4), 68-78. https://doi.org/10.1108/13527590210433366
  • Förster, J., Higgins, E. T., & Bianco, A. T. (2003). Speed/accuracy decisions in task performance: Built-in trade-off or separate strategic concerns? Organizational Behavior and Human Decision Processes, 90(1), 148-164. https://www.sciencedirect.com/science/article/pii/S0749597802005095
  • Gawande, A. (2011). The checklist manifesto: How to get things right. Metropolitan Books.
  • Harrison, A. (2011, June 9). Students hit by more exam errors. BBC. https://www.bbc.co.uk/news/education-13710868
  • Higgins, E. T. (1997). Beyond pleasure and pain. American Psychologist, 52(12), 1280-1300. https://pdfs.semanticscholar.org/6b64/5e0418ae70e82cc322dd6fbf0647ae2523e4.pdf
  • Innocence Project (2021). The innocence project. https://www.innocenceproject.org
  • International Civil Aviation Organization (1993). Investigation of Human Factors in Accidents and Incidents. Human Factor Digest No.7. https://skybrary.aero/bookshelf/books/2037.pdf
  • Jones, D. G., & Endsley, M. R. (1996). Sources of situation awareness errors in aviation. Aviation, Space, and Environmental Medicine, 67(6), 507-512.
  • Kahneman, D. (2011). Thinking, fast and slow. Penguin Group.
  • Kontogiannis, T., & Malakis, S. (2009). A proactive approach to human error detection and identification in aviation and air traffic control. Safety Science, 47, 693-706.
  • Meredith, R. (2019, May 17). AS-level Economics exam error under investigation in NI. BBC. https://www.bbc.co.uk/news/uk-northern-ireland-48313904
  • Mitleton-Kelly, E. (2003). Ten principles of complexity and enabling infrastructures. In E. Mitleton-Kelly (Ed.), Complex systems and evolutionary perspectives on organisations: the application of complexity theory to organisations. Elsevier.
  • New Straits Times (2015). S. Korea exam chief resigns over errors in high-stakes college test. https://www.nst.com.my/news/2015/09/s-korea-exam-chief-resignsover-errors-high-stakes-college-test
  • Nisbet, I., & Shaw, S. (2020). Is Assessment Fair? Sage.
  • Oates, T. (2017). A Cambridge Approach to improving education. Cambridge Assessment. http://www.cambridgeassessment.org.uk/Images/cambridge approach to improving education.pdf
  • Ofqual (2019). GCSE, AS & A level summer report 2018. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/852440/GQ-Summer-Report-2019-MON1100.pdf
  • Popper, K. (1963). Conjectures and refutations: The growth of scientific knowledge. Routledge and Kegan Paul.
  • Pronovost, P., & Vohr, E. (2011). Safe Patients, Smart Hospitals: How One Doctor's Checklist Can Help Us Change Health Care from the Inside Out. Penguin books.
  • Reason, J. (1990). Human error. Cambridge University Press.
  • Reason, J. (2008). The human contribution. Ashgate.
  • Reason, J. (2013). A life in error: from little slips to big disasters. Ashgate.
  • Rhoades, K., & Madaus, G. (2003). Errors in standardized tests: A systemic problem. Boston College.
  • Richardson, H. (2017, May 26). GCSE exam error: Board accidentally rewrites Shakespeare. BBC. https://www.bbc.co.uk/news/education-40059967
  • Rodriguez, M. (2015). Selected-response item development. In S. Lane, M. Raymond, & T. Haladyna (Eds.), Handbook of Test Development (pp. 259-273). Routledge.
  • Syed, M. (2015). Black box thinking. Marginal gains and the secrets of high performance. John Murray.
  • Verne, J. (1996). Journey to the centre of the earth. Wordsworth Editions Limited. (Original work published 1864.)
  • Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (1999). Organising for high reliability: Processes of collective mindfulness. In R. I. Sutton & B. M. Staw (Eds.), Research in Organizational Behavior, 21, 23-81.
  • Wickens, C. D. (2008). Situation awareness: Review of Mica Endsley’s 1995 articles on situation awareness theory and measurement. Human Factors, 50(3), 397-403. https://journals.sagepub.com/doi/pdf/10.1518/001872008X288420
  • Wiegmann, D. A., & Shappell, S. A. (2003). A Human Error Approach to Aviation Accident Analysis. Ashgate.

There are 48 references in total.

Details

Primary Language: English
Subjects: Studies on Education
Section: Articles
Authors

Irenka Suto 0000-0001-6871-901X

Jo Ireland 0000-0003-1237-7860

Publication Date: June 10, 2021
Submission Date: December 3, 2020
Published Issue: Year 2021, Volume: 8, Issue: 2

How to Cite

APA Suto, I., & Ireland, J. (2021). Principles for Minimizing Errors in Examination Papers and Other Educational Assessment Instruments. International Journal of Assessment Tools in Education, 8(2), 310-325. https://doi.org/10.21449/ijate.897874
