Review

Beyond self-reports: Addressing bias and improving data quality in educational research

Year 2025, Volume: 16 Issue: 2, 115 - 123, 30.06.2025
https://doi.org/10.21031/epod.1630477

Abstract

The use of self-report instruments in educational research has facilitated data collection by providing easy access to diverse populations in a short period of time. However, research has shown that these opportunities come with challenges. From insufficient effort responding to culturally influenced answers, numerous pitfalls can call the validity of our findings into question. For example, discrepancies between self-reports and objective data often reveal underlying biases. Poor data quality, exacerbated by social desirability in sensitive constructs, individual and environmental factors, and changes in scale structure, highlights that our methods may have limitations that reduce generalisability and feed the replicability crisis. These limitations, however, can also drive improvements in both survey design and data interpretation. Our experiences point to the need to integrate multiple data sources, to improve survey development and adaptation methods, and to use true experimental designs more frequently. By reflecting on these challenges, we suggest new directions for survey implementation in educational research to improve the reliability and replicability of our findings and deepen our understanding of complex human dynamics.

Ethical Statement

The author has followed the rules of good scientific practice and there are no conflicts of interest. All procedures in the study conformed to the ethical standards of the Helsinki Declaration of 1964 and its subsequent amendments.

Supporting Institution

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Thanks

The author would like to thank the editor(s) and reviewers for their valuable contribution to the refinement of the manuscript.

References

  • Akbulut, Y. (2015). Predictors of inconsistent responding in web surveys. Internet Research, 25(1), 131-147. https://doi.org/10.1108/IntR-01-2014-0017
  • Akbulut, Y. (2024). Impact of different practice testing methods on learning outcomes. European Journal of Education, 59(2), e12626. https://doi.org/10.1111/ejed.12626
  • Akbulut, Y., & Dönmez, O. (2018). Predictors of digital piracy among Turkish undergraduate students. Telematics and Informatics, 35(5), 1324-1334. https://doi.org/10.1016/j.tele.2018.03.004
  • Akbulut, Y., Saykılı, A., Öztürk, A., & Bozkurt, A. (2023). What if it's all an illusion? To what extent can we rely on self-reported data in open, online, and distance education systems? International Review of Research in Open and Distributed Learning, 24(3), 1-17. https://doi.org/10.19173/irrodl.v24i3.7321
  • Akbulut, Y., Uysal, Ö., Odabasi, H. F., & Kuzu, A. (2008). Influence of gender, program of study and PC experience on unethical computer using behaviors of Turkish undergraduate students. Computers & Education, 51(2), 485-492. https://doi.org/10.1016/j.compedu.2007.06.004
  • Alemdag, E., & Cagiltay, K. (2018). A systematic review of eye tracking research on multimedia learning. Computers & Education, 125, 413-428. https://doi.org/10.1016/j.compedu.2018.06.023
  • Andrade, C. (2021). HARKing, cherry-picking, p-hacking, fishing expeditions, and data dredging and mining as questionable research practices. The Journal of Clinical Psychiatry, 82(1), 25941. https://doi.org/10.4088/JCP.20f13804
  • Andrejevic, M., & Selwyn, N. (2020). Facial recognition technology in schools: Critical questions and concerns. Learning, Media and Technology, 45(2), 115-128. https://doi.org/10.1080/17439884.2020.1686014
  • Boateng, G. O., Neilands, T. B., Frongillo, E. A., Melgar-Quiñonez, H. R., & Young, S. L. (2018). Best practices for developing and validating scales for health, social, and behavioral research: a primer. Frontiers in Public Health, 6, 149. https://doi.org/10.3389/fpubh.2018.00149
  • Booth, C., Wong, E., Brown, M., & Fitzsimons, E. (2024). Evaluating the effect of monetary incentives on web survey response rates in the UK Millennium Cohort Study. Survey Research Methods, 18(1), 47-58. https://doi.org/10.18148/srm/2024.v18i1.8210
  • Chang, M., Büchel, D., Reinecke, K., Lehmann, T., & Baumeister, J. (2022). Ecological validity in exercise neuroscience research: A systematic investigation. European Journal of Neuroscience, 55(2), 487-509. https://doi.org/10.1111/ejn.15595
  • Dalton, D., & Ortegren, M. (2011). Gender differences in ethics research: The importance of controlling for the social desirability response bias. Journal of Business Ethics, 103, 73-93. https://doi.org/10.1007/s10551-011-0843-8
  • Danziger, S., Levav, J., & Avnaim-Pesso, L. (2011). Extraneous factors in judicial decisions. Proceedings of the National Academy of Sciences, 108(17), 6889-6892. https://doi.org/10.1073/pnas.1018033108
  • Demir, E. K. (2023). Investigating the effect of item order on the psychometric properties of a self-efficacy perception scale. E-International Journal of Educational Research, 14(5). https://doi.org/10.19160/e-ijer.1362442
  • Demirkol, S., & Kelecioglu, H. (2022). Investigating the effect of item position on person and item parameters: PISA 2015 Turkey sample. Journal of Measurement and Evaluation in Education and Psychology, 13(1), 69-85. https://doi.org/10.21031/epod.958576
  • DeSimone, J. A., Harms, P. D., & DeSimone, A. J. (2015). Best practice recommendations for data screening. Journal of Organizational Behavior, 36(2), 171-181. https://doi.org/10.1002/job.1962
  • DeSimone, J. A., & Harms, P. D. (2018). Dirty data: The effects of screening respondents who provide low-quality data in survey research. Journal of Business and Psychology, 33, 559-577. https://doi.org/10.1007/s10869-017-9514-9
  • Dönmez, O., & Akbulut, Y. (2016). Social desirability bias in cyberbullying research. Educational Technology: Theory and Practice, 6(2), 1-18. https://doi.org/10.17943/etku.19033
  • Dönmez, O., Akbulut, Y., Zabzun, G., & Köseoğlu, B. (2025). Effects of survey order on subjective measures of cognitive load: A randomized controlled trial. Applied Cognitive Psychology, 39(2), e70039. https://doi.org/10.1002/acp.70039
  • Erdem-Kara, B., & Akbulut, Y. (2025). Prevalence and psychometric implications of careless responses in an online student survey. Journal of Psychoeducational Assessment, 07342829251328132. https://doi.org/10.1177/07342829251328132
  • Goodman, J., Anson, R., & Belcheir, M. (2015). The effect of incentives and other instructor-driven strategies to increase online student evaluation response rates. Assessment & Evaluation in Higher Education, 40(7), 958-970. https://doi.org/10.1080/02602938.2014.960364
  • Grieve, R., & Elliott, J. (2013). Cyberfaking: I can, so I will? Intentions to fake in online psychological testing. Cyberpsychology, Behavior, and Social Networking, 16(5), 364–369. https://doi.org/10.1089/cyber.2012.0271
  • Hambleton, R. K., & Traub, R. E. (1974). The effects of item order on test performance and stress. The Journal of Experimental Education, 43(1), 40-46. https://doi.org/10.1080/00220973.1974.10806302
  • Heggestad, E. D., Scheaf, D. J., Banks, G. C., Monroe Hausfeld, M., Tonidandel, S., & Williams, E. B. (2019). Scale adaptation in organizational science research: A review and best-practice recommendations. Journal of Management, 45(6), 2596-2627. https://doi.org/10.1177/0149206319850280
  • Huang, F. L., & Cornell, D. G. (2016). Question order affects the measurement of bullying victimization among middle school students. Educational and Psychological Measurement, 76(5), 724–740. https://doi.org/10.1177/0013164415622664
  • Iaconelli, R., & Wolters, C. A. (2020). Insufficient effort responding in surveys assessing self-regulated learning: Nuisance or fatal flaw? Frontline Learning Research, 8(3), 104-125. https://doi.org/10.14786/flr.v8i3.521
  • King, M. F., & Bruner, G. C. (2000). Social desirability bias: A neglected aspect of validity testing. Psychology and Marketing, 17(2), 79-103. https://doi.org/10.1002/(SICI)1520-6793(200002)17:2<79::AID-MAR2>3.0.CO;2-0
  • Krosnick, J. A. (1991). Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology, 5(3), 213-36. https://doi.org/10.1002/acp.2350050305
  • Krosnick, J. A., & Alwin, D. F. (1987). An evaluation of a cognitive theory of response-order effects in survey measurement. Public Opinion Quarterly, 51(2), 201-219. https://doi.org/10.1086/269029
  • Newman, D. L., Kundert, D. K., Lane Jr, D. S., & Bull, K. S. (1988). Effect of varying item order on multiple-choice test scores: Importance of statistical and cognitive difficulty. Applied Measurement in Education, 1(1), 89-97. https://doi.org/10.1207/s15324818ame0101_8
  • Örün, Ö., & Akbulut, Y. (2019). Effect of multitasking, physical environment and electroencephalography use on cognitive load and retention. Computers in Human Behavior, 92, 216-229. https://doi.org/10.1016/j.chb.2018.11.027
  • Rasinski, K. A., Lee, L., & Krishnamurty, P. (2012). Question order effects. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, Vol. 1. Foundations, planning, measures, and psychometrics (pp. 229–248). American Psychological Association. https://doi.org/10.1037/13619-014
  • Roos, A. L., Goetz, T., Voracek, M., Krannich, M., Bieg, M., Jarrell, A., & Pekrun, R. (2021). Test anxiety and physiological arousal: A systematic review and meta-analysis. Educational Psychology Review, 33, 579-618. https://doi.org/10.1007/s10648-020-09543-z
  • Siddiq, F., & Scherer, R. (2019). Is there a gender gap? A meta-analysis of the gender differences in students' ICT literacy. Educational Research Review, 27, 205-217. https://doi.org/10.1016/j.edurev.2019.03.007
  • Skulmowski, A. (2023). Learners emphasize their intrinsic load if asked about it first: communicative aspects of cognitive load measurement. Mind, Brain, and Education, 17(3), 161-218. https://doi.org/10.1111/mbe.12369
  • Stark, T. H., Silber, H., Krosnick, J. A., Blom, A. G., Aoyagi, M., Belchior, A., Bosnjak, M., Clement, S. L., John, M., Jónsdóttir, G.A., Lawson, K., Lynn, P., Martinsson, J., Shamshiri-Petersen, D., Tvinnereim, E., & Yu, R. R. (2020). Generalization of classic question order effects across cultures. Sociological Methods & Research, 49(3), 567-602. https://doi.org/10.1177/0049124117747304
  • Statista. (2023, September 29). Turkey: most popular sports 2013. https://www.statista.com/statistics/677503/most-followed-sports-turkey/
  • Stefkovics, Á., & Kmetty, Z. (2022). A comparison of question order effects on item-by-item and grid formats: visual layout matters. Measurement Instruments for the Social Sciences, 4(8), 1-12. https://doi.org/10.1186/s42409-022-00036-z
  • Steger, D., Jankowsky, K., Schroeders, U., & Wilhelm, O. (2023). The road to hell is paved with good intentions: How common practices in scale construction hurt validity. Assessment, 30(6), 1811-1824. https://doi.org/10.1177/10731911221124846
  • Şahin, M. D. (2021). Effect of item order on certain psychometric properties: A demonstration on a cyberloafing scale. Frontiers in Psychology, 12, 590545. https://doi.org/10.3389/fpsyg.2021.590545
  • Thorndike, F. P., Carlbring, P., Smyth, F. L., Magee, J. C., Gonder-Frederick, L., Ost, L. G., & Ritterband, L. M. (2009). Web-based measurement: Effect of completing single or multiple items per webpage. Computers in Human Behavior, 25(2), 393-401. https://doi.org/10.1016/j.chb.2008.05.006
  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131. https://doi.org/10.1126/science.185.4157.1124
  • Weinberg, M. K., Seton, C., & Cameron, N. (2018). The measurement of subjective wellbeing: Item-order effects in the personal wellbeing index—adult. Journal of Happiness Studies, 19(1), 315-332. https://doi.org/10.1007/s10902-016-9822-1
  • Wise, S. L., Kuhfeld, M. R., & Lindner, M. A. (2024). Don’t test after lunch: The relationship between disengagement and the time of day that low-stakes testing occurs. Applied Measurement in Education, 37(1), 14–28. https://doi.org/10.1080/08957347.2024.2311925
  • Yedukondalu, J., Sunkara, K., Radhika, V., Kondaveeti, S., Anumothu, M., & Murali Krishna, Y. (2025). Cognitive load detection through EEG lead wise feature optimization and ensemble classification. Scientific Reports, 15(1), 842. https://doi.org/10.1038/s41598-024-84429-6
  • Xue, Y., Wang, K., & Qiu, Y. (2024). Enhancing online learning: A multimodal approach for cognitive load assessment. International Journal of Human-Computer Interaction. Advance online publication. https://doi.org/10.1080/10447318.2024.2327198
  • Yörük-Açıkel, B., Turhan, U., & Akbulut, Y. (2018). Effect of multitasking on simulator sickness and performance in 3D aerodrome control training. Simulation & Gaming, 49(1), 27-49. https://doi.org/10.1177/1046878117750417
  • Zhou, M., & Winne, P.H. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction, 22(6), 413-419. https://doi.org/10.1016/j.learninstruc.2012.03.004
  • Zhu, W., Wang, F., Mayer, R. E., & Liu, T. (2024). Effects of explaining a science lesson to others or to oneself: A cognitive neuroscience approach. Learning and Instruction, 91, 101897. https://doi.org/10.1016/j.learninstruc.2024.101897

There are 49 citations in total.

Details

Primary Language English
Subjects Testing, Assessment and Psychometrics (Other)
Journal Section Articles
Authors

Yavuz Akbulut (ORCID: 0000-0003-1457-8009)

Publication Date June 30, 2025
Submission Date January 31, 2025
Acceptance Date May 27, 2025
Published in Issue Year 2025 Volume: 16 Issue: 2

Cite

APA Akbulut, Y. (2025). Beyond self-reports: Addressing bias and improving data quality in educational research. Journal of Measurement and Evaluation in Education and Psychology, 16(2), 115-123. https://doi.org/10.21031/epod.1630477