Research Article

Effect of random item ordering in multiple choice tests on the academic achievement of online learners

Year 2025, Volume: 15 Issue: 2, 1254 - 1266, 29.06.2025
https://doi.org/10.48146/odusobiad.1599977

Abstract

This study examined the effect of randomly ordered multiple-choice test forms on students' academic achievement. It was carried out with a true experimental design, and all participating students received their instruction through online learning. The study group consisted of 2932 freshman university students enrolled at different faculties of a state university in the fall semester of the 2023-2024 academic year. A 20-item multiple-choice test was used to measure students' academic achievement, and four different test forms were generated by randomly ordering its items. One-way ANOVA was used to test whether the mean scores of students who took different test forms differed significantly, and the test statistics of the different forms were also examined. The findings showed (1) no significant difference between students' mean scores on the different test forms and (2) balanced score distributions across forms, indicating that the forms generated by random ordering are equivalent; the similarity of their test statistics supports this conclusion. In short, test forms created through random item ordering did not affect students' academic achievement. The study therefore suggests that educators, test developers, and administrators create different test forms through randomization as a cheating-prevention measure in multiple-choice tests.
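For readers who want to see the design concretely, the sketch below illustrates the two computational steps the abstract describes: generating parallel test forms by randomly permuting the same 20 items, and comparing the mean scores of the groups that took each form with a one-way ANOVA. This is a minimal Python illustration, not the authors' code; the item labels, group sizes, and score data are hypothetical placeholders, and scores are simulated only so the example runs end to end.

# Minimal sketch (not the authors' code): generate randomly ordered forms
# of a 20-item multiple-choice test and compare group means with a
# one-way ANOVA, as described in the abstract.
import random

import numpy as np
from scipy import stats

ITEMS = [f"Item {i:02d}" for i in range(1, 21)]  # the 20 test items

def make_form(items, seed):
    """Return one test form: the same items in an independent random order."""
    rng = random.Random(seed)
    form = list(items)
    rng.shuffle(form)
    return form

# Four forms (A-D), each a random permutation of the same item pool.
forms = {name: make_form(ITEMS, seed) for seed, name in enumerate("ABCD")}

# Hypothetical total scores (0-20) for roughly 733 students per form
# (2932 students split across four forms), simulated for illustration.
sim = np.random.default_rng(42)
scores_by_form = [sim.binomial(n=20, p=0.65, size=733) for _ in forms]

# One-way ANOVA: do the mean scores differ significantly across forms?
f_stat, p_value = stats.f_oneway(*scores_by_form)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
# A p-value above .05 indicates no significant difference between form
# means, consistent with the study's conclusion that randomly ordered
# forms are equivalent.

Because every form contains the same 20 items and only their order varies, a non-significant ANOVA result of this kind is what justifies treating the randomly ordered forms as interchangeable variants for cheating prevention.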

References

  • Aamodt, M. G., & McShane, T. (1992). A meta-analytic investigation of the effect of various test item characteristics on test scores and test completion times. Public Personnel Management, 21(2), 151-160. https://doi.org/10.1177/009102609202100203
  • Baffoe, J., Asamoah, D., Shahrill, M., Latif, S. N. A., Asamoah Gyimah, K., & Anane, E. (2024, April). Does the sequence of items influence secondary school students’ performance in mathematics and science? In AIP Conference Proceedings (Vol. 3052, No. 1). AIP Publishing. https://doi.org/10.1063/5.0202870
  • Baghaei, P., & Amrahi, N. (2011). The effects of the number of options on the psychometric characteristics of multiple choice items. Psychological Test and Assessment Modeling, 53(2), 192-211. https://www.psychologie-aktuell.com/fileadmin/download/ptam/2-2011_20110622/02_Baghaei.pdf
  • Balch, W. R. (1989). Item order affects performance on multiple-choice exams. Teaching of Psychology, 16(2), 75–77. https://doi.org/10.1207/s15328023top1602_9
  • Baldwin, B. A., & Howard, T. P. (1983). Intertopical sequencing of examination questions: An evaluation. Journal of Accounting Education, 1(1), 89–95. https://doi.org/10.1016/0748-5751(83)90010-6
  • Bard, G., & Weinstein, Y. (2017). The effect of question order on evaluations of test performance: Can the bias dissolve? Quarterly Journal of Experimental Psychology, 70(10), 2130-2140. https://doi.org/10.1080/17470218.2016.1225108
  • Başol, G. (2018). Measurement and evaluation in education. Pegem Akademi. https://doi.org/10.14527/9786053645887
  • Biesta, G. (2009). Good education in an age of measurement: On the need to reconnect with the question of purpose in education. Educational Assessment, Evaluation and Accountability, 21, 33-46. https://doi.org/10.1007/s11092-008-9064-9
  • Borsboom, D., & Molenaar, D. (2015). Psychometrics. In James D. Wright (Ed.), International Encyclopedia of the Social & Behavioral Sciences (Second Edition, pp. 418-422). Elsevier Ltd. https://doi.org/10.1016/B978-0-08-097086-8.43079-5
  • Butler, A. C. (2018). Multiple-choice testing in education: Are the best practices for assessment also good for learning? Journal of Applied Research in Memory and Cognition, 7(3), 323-331. https://doi.org/10.1016/j.jarmac.2018.07.002
  • Canlar, M., & Jackson, W. K. (1991). Alternative test question sequencing in introductory financial accounting. Journal of Education for Business, 67(2), 116-119. https://doi.org/10.1080/08832323.1991.10117529
  • Carlson, J. L., & Ostrosky, A. L. (1992). Item sequence and student performance on multiple-choice exams: Further evidence. The Journal of Economic Education, 23(3), 232–235. https://doi.org/10.1080/00220485.1992.10844757
  • Carnegie, J. A. (2017). Does correct answer distribution influence student choices when writing multiple choice examinations? Canadian Journal for the Scholarship of Teaching and Learning, 8(1), 11. http://ir.lib.uwo.ca/cjsotl_rcacea/vol8/iss1/1
  • Cook, D. A., & Beckman, T. J. (2006). Current concepts in validity and reliability for psychometric instruments: Theory and application. The American Journal of Medicine, 119(2), 166-e7. https://doi.org/10.1016/j.amjmed.2005.10.036
  • Cronbach, L. J. (1950). Further evidence on response sets and test design. Educational and Psychological Measurement, 10(1), 3-31. https://doi.org/10.1177/00131644500100010
  • Davis, D. B. (2017). Exam question sequencing effects and context cues. Teaching of Psychology, 44(3), 263-267. https://doi.org/10.1177/009862831771275
  • Doğan Gül, Ç., & Çokluk Bökeoğlu, Ö. (2018). The comparison of academic success of students with low and high anxiety levels in tests varying in item difficulty. Inonu University Journal of the Faculty of Education, 19(3), 252-265. https://doi.org/10.17679/inuefd.341477
  • Ertmer, P. A., & Newby, T. J. (2013). Behaviorism, cognitivism, constructivism: Comparing critical features from an instructional design perspective. Performance Improvement Quarterly, 26(2), 43-71. https://doi.org/10.1002/piq.21143
  • Fraenkel, J. R., & Wallen, N. E. (2012). How to design and evaluate research in education (7th ed.). McGraw-Hill.
  • Frederiksen, J. R., White, B. Y., & Gutwill, J. (1999). Dynamic mental models in learning science: The importance of constructing derivational linkages among models. Journal of Research in Science Teaching, 36(7), 806-836. https://doi.org/10.1002/(SICI)1098-2736(199909)36:7<806::AID-TEA5>3.0.CO;2-2
  • Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers and Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
  • Good, C. V. (1973). Dictionary of education. New York: McGraw-Hill Book Company.
  • Green, B. F. (1981). A primer of testing. American Psychologist, 36(10), 1001-1011. https://doi.org/10.1037/0003-066X.36.10.1001
  • Gruber, R. A. (1987). Sequencing exam questions relative to topic presentation. Journal of Accounting Education, 5, 77–86. https://doi.org/10.1016/0748-5751(87)90039-X
  • Gyamfi, A. (2022). Controlling examination malpractice in Senior High Schools in Ghana through performance-based assessment. Journal of Advances in Education and Philosophy, 6(3), 203-211. https://doi.org/10.36348/jaep.2022.v06i04.002
  • Gyamfi, A., Acquaye, R., & Adjei, C. (2023). Multiple-choice items should be sequenced in order of difficulty with the easiest ones placed first. Does it really affect performance? Research Square. https://doi.org/10.21203/rs.3.rs-2882983/v1
  • Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2013). Multivariate data analysis. Pearson Education Limited.
  • Hambleton, R. K., & Traub, R. E. (1974). The effects of item order on test performance and stress. The Journal of Experimental Education, 43(1), 40-46. https://doi.org/10.1080/00220973.1974.10806302
  • Hauck, K. B., Mingo, M. A., & Williams, R. L. (2017). A review of relationships between item sequence and performance on multiple-choice exams. Scholarship of Teaching and Learning in Psychology, 3(1), 58–75. https://doi.org/10.1037/stl0000077
  • Hodson, D. (1984). Some effects of changes in question structure and sequence on performance in a multiple choice chemistry test. Research in Science & Technological Education, 2(2), 177–185. https://doi.org/10.1080/0263514840020209
  • Howard, J. M., & Scott, A. (2017). Any time, any place, flexible pace: Technology-enhanced language learning in a teacher education programme. Australian Journal of Teacher Education (Online), 42(6), 51-68. https://doi.org/10.14221/ajte.2017v42n6.4
  • Kalaycı, Ş. (2008). SPSS uygulamalı çok değişkenli istatistik teknikleri [Multivariate statistical techniques with SPSS applications]. Ankara: Asil Yayın Dağıtım.
  • Kaplan, R. M., & Saccuzzo, D. P. (2001). Psychological testing: Principles, applications, and issues (5th ed.). Wadsworth/Thomson Learning.
  • Kirschner, P. A. (2002). Cognitive load theory: Implications of cognitive load theory on the design of learning. Learning and Instruction, 12(1), 1-10. https://doi.org/10.1016/S0959-4752(01)00014-7
  • Kuder, G. F., & Richardson, M. W. (1937). The theory of the estimation of test reliability. Psychometrika, 2, 151–160. https://doi.org/10.1007/BF02288391
  • Leary, L. F., & Dorans, N. J. (1985). Implications for altering the context in which test items appear: A historical perspective on an immediate concern. Review of Educational Research, 55(3), 387-413. https://doi.org/10.3102/00346543055003387
  • Linn, R. L., & Gronlund, N. E. (1995). Measurement and assessment in teaching (7th ed.). Prentice Hall.
  • Lowe, D. (1991). Set a multiple choice question (MCQ) examination. British Medical Journal, 302, 780-782. https://doi.org/10.1136/bmj.302.6779.780
  • McKeachie, W. J., Pollie, D., & Speisman, J. (1955). Relieving anxiety in classroom examinations. The Journal of Abnormal and Social Psychology, 50(1), 93–98. https://doi.org/10.1037/h0046560
  • Norman, R. D. (1954). The effects of a forward retention set on an objective achievement test presented forwards or backwards. Educational and Psychological Measurement, 14, 487–498. https://doi.org/10.1177/001316445401400305
  • Opara, I. M. (2021). Test construction and measurement: Concepts and applications. Reliable Publishers.
  • Opara, I. M., & Uwah, I. V. (2017). Effect of test item arrangement on performance in mathematics among junior secondary school students in Obio/Akpor Local Government Area of Rivers State, Nigeria. British Journal of Education, 5(8), 1-9. https://eajournals.org/bje/vol-5-issue-8-july-2017-special-issue/
  • Papenberg, M., Diedenhofen, B., & Musch, J. (2021). An experimental validation of sequential multiple-choice tests. The Journal of Experimental Education, 89(2), 402–421. https://doi.org/10.1080/00220973.2019.1671299
  • Paretta, R. L., & Chadwick, L. W. (1975). The sequencing of examination questions and its effects on student performance. The Accounting Review, 50(3), 595-601. https://www.jstor.org/stable/245020
  • Plake, B. S., Ansorge, C. J., Parker, C. S., & Lowry, S. R. (1982). Effects of item arrangement, knowledge of arrangement, test anxiety, and sex on test performance. Journal of Educational Measurement, 19(1), 49–57. http://www.jstor.org/stable/1434918
  • Peek, G. S. (1994). Using test-bank software for randomized test-item sequencing in managerial accounting. Journal of Education for Business, 70(2), 77–81. https://doi.org/10.1080/08832323.1994.10117728
  • Pettijohn, T. F., II, & Sacco, M. F. (2007). Multiple-choice exam question order influences on student performance, completion time, and perceptions. Journal of Instructional Psychology, 34(3), 142-149. https://www.proquest.com/scholarly-journals/multiple-choice-exam-question-order-influences-on/docview/213904129/se-2
  • Roediger, H. L. III, & Marsh, E. J. (2005). The positive and negative consequences of multiple-choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(5), 1155–1159. https://doi.org/10.1037/0278-7393.31.5.1155
  • Russell, M., Fischer, M. J., Fischer, C. M., & Premo, K. (2003). Exam question sequencing effects on marketing and management sciences student performance. Journal for Advancement of Marketing Education, 3, 1–10. https://www.asbbs.org/files/marketing.pdf#page=168
  • Schimit, J. C., & Sheirer, C. J. (1977). The effect of item order on objective tests. Teaching of Psychology, 4(3), 144-153. https://doi.org/10.1207/s15328023top0403_11
  • Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2004). Different written assessment methods: What can be said about their strengths and weaknesses? Medical Education, 38(9), 974-979. https://doi.org/10.1111/j.1365-2929.2004.01916.x
  • Siddiqui, A. A., Zain Ul Abideen, M., Fatima, S., Talal Khan, M., Gillani, S. W., Alrefai, Z. A., Waqar Hussain, M., & Rathore, H. A. (2024). Students’ perception of online versus face-to-face learning: What do the healthcare teachers have to know? Cureus, 16(2), e54217. https://doi.org/10.7759/cureus.54217
  • Skinner, N. F. (1999). When the going gets tough, the tough get going: Effects of item difficulty on multiple-choice test performance. North American Journal of Psychology, 7(1), 79-82. https://files.eric.ed.gov/fulltext/ED449388.pdf#page=83
  • Smith, S. M. (1979). Remembering in and out of context. Journal of Experimental Psychology: Human Learning and Memory, 5, 460–471. https://doi.org/10.1037/0278-7393.5.5.460
  • Stanley, J. C. (1961). Studying status vs. manipulating variables. Educational and Psychological Measurement, 21(4), 793-795. https://doi.org/10.1177/001316446102100
  • Stout, D. E., & Heck, J. L. (1995). Empirical findings regarding student exam performance and question sequencing: The case of the cumulative final. Journal of Financial Education, 21, 29-35. https://www.jstor.org/stable/41948181
  • Sue, D. L. (2009). The effect of scrambling test questions on student performance in a small class setting. Journal for Economic Educators, 9(1), 32-41. https://libjournals.mtsu.edu/index.php/jfee/article/view/1454
  • Surahman, E., & Wang, T. H. (2022). Academic dishonesty and trustworthy assessment in online learning: A systematic literature review. Journal of Computer Assisted Learning, 38(6), 1535-1553. https://doi.org/10.1111/jcal.12708
  • Sweller, J. (2011). Cognitive load theory. In Psychology of Learning and Motivation (Vol. 55, pp. 37-76). Academic Press. https://doi.org/10.1016/B978-0-12-387691-1.00002-8
  • Şad, S. N. (2020). Does difficulty-based item order matter in multiple-choice exams? (Empirical evidence from university students). Studies in Educational Evaluation, 64, 100812. https://doi.org/10.1016/j.stueduc.2019.100812
  • Tamir, P. (1991). Multiple choice items: How to gain the most out of them. Biochemical Education, 19(4), 188–192. https://doi.org/10.1016/0307-4412(91)90094-O
  • Thissen, D. (2017). Reliability and measurement precision. In H. Wainer (Ed.), Computerized adaptive testing: A primer (pp. 161-185). Lawrence Erlbaum.
  • Vander Schee, B. A. (2009). Test item order, academic achievement and student performance on principles of marketing examinations. Journal for Advancement of Marketing Education, 14(1), 23-30. https://www.proquest.com/openview/508fdf8b2223b6d77c9bc85f634dd0fc/1?pq-origsite=gscholar&cbl=5256660
  • Vander Schee, B. A. (2013). Test item order, level of difficulty, and student performance in marketing education. Journal of Education for Business, 88(1), 36-42. https://doi.org/10.1080/08832323.2011.633581
  • Weinstein, Y., & Roediger, H. L. (2012). The effect of question order on evaluations of test performance: How does the bias evolve? Memory & Cognition, 40, 727-735. https://doi.org/10.3758/s13421-012-0187-3
  • YÖKA1 (2018). Principles of Ataturk and History of Revolution I, Ordu University Course Catalog/Information Package. Retrieved June 6, 2024, from https://bologna.odu.edu.tr/DereceProgramlari/Ders/0/237/43481/41348/1?lang=en-US
  • Zeidner, M. (1998). Test anxiety: The state of the art. Plenum Press.


Details

Primary Language English
Subjects Measurement and Evaluation in Education (Other)
Journal Section Research Article
Authors

Necati Taşkın 0000-0001-8519-6185

Bülent Kandemir 0000-0002-2852-547X

Publication Date June 29, 2025
Submission Date December 11, 2024
Acceptance Date March 27, 2025
Published in Issue Year 2025 Volume: 15 Issue: 2

Cite

APA Taşkın, N., & Kandemir, B. (2025). Effect of random item ordering in multiple choice tests on the academic achievement of online learners. Ordu Üniversitesi Sosyal Bilimler Enstitüsü Sosyal Bilimler Araştırmaları Dergisi, 15(2), 1254-1266. https://doi.org/10.48146/odusobiad.1599977
