Research Article

Student engagement with a novel online assessment strategy

Year 2024, Volume: 11 Issue: 1, 1-19, 16.03.2024
https://doi.org/10.21449/ijate.1428025

Abstract

The study investigated the impact on student engagement and achievement of a "formative thresholded" continuous assessment strategy, in which students had to meet a modest threshold but their continuous assessment marks did not contribute to their final grade. Students were free to choose their own blend of tutor-marked and computer-marked assignments, provided they met the threshold in a minimum number of assignments. For students reaching this threshold, the final grade was determined solely by an end-of-year examination. This assessment strategy was compared with one in which summative assignments contributed to the final grade alongside the final examination. The methodology comprised analysis of assessment records from more than 3500 anonymized students over a period of seven years. Evidence was found for improved student engagement under the formative thresholded assessment strategy, with students choosing to complete many more computer-marked assignments, balanced by slightly fewer tutor-marked assignments. There was no change in overall retention. Many students made use of the capability to repeat different versions of the computer-marked questions. There was some evidence of students gaming the system (repeating questions until they obtained a version they had seen before), though most students did not do this at all and only a small number did so regularly. Students appeared to make better use of model answers, provided in response to three incorrect tries at a question, than of hints provided after each try. It was noted that the formative thresholded assessment strategy has potential for wider use.

Details

Primary Language: English
Subjects: Classroom Measurement Practices
Journal Section: Articles
Authors

Sally Jordan 0000-0003-0770-1443

John P.R. Bolton 0009-0004-7191-6283

Early Pub Date: March 13, 2024
Publication Date: March 16, 2024
Submission Date: August 21, 2021
Published in Issue: Year 2024 Volume: 11 Issue: 1

Cite

APA Jordan, S., & Bolton, J. P. (2024). Student engagement with a novel online assessment strategy. International Journal of Assessment Tools in Education, 11(1), 1-19. https://doi.org/10.21449/ijate.1428025
