Research Article

Towards an Online Self-Assessment for Informed Study Decisions–A Mixed-Methods Validation Study

Year 2022, Volume: 9 Issue: 2, 376 - 396, 26.06.2022

Abstract

Informed study decisions are pivotal for student retention in higher online education. A self-assessment prior to enrolment has been proposed as a promising approach to enable informed decision-making and to build resources for retention. To determine whether such a self-assessment affects the decision-making process as intended, thorough and careful validation is a necessity. This study reports on two validity aspects that are less commonly addressed in that respect, yet essential for evaluating effectiveness: response processes and consequences of (self-)testing. To map the response processes and consequences of the current self-assessment, a mixed-methods approach was used in which eight prospective students took the self-assessment in an observed think-aloud mode and were interviewed before and afterwards. Results show different response processes depending on the type of subtest taken. The results also indicate that the consequential aspect of validity must be considered in the context of decision-making phases. The demonstrated evidence and possible threats to validity are discussed in light of refining the self-assessment and embedding it in counselling practice.

References

  • Abbasi, N., & Ghosh, S. (2020). Construction and standardization of examination anxiety scale for adolescent students. International Journal of Assessment Tools in Education, 7(4), 522-534. https://doi.org/10.21449/ijate.793084
  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. American Psychological Association.
  • Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215.
  • Beckman, T.J., Cook, D.A., & Mandrekar, J.N. (2005). What is the validity evidence for assessments of clinical teaching? Journal of General Internal Medicine, 20(12), 1159-1164. https://doi.org/10.1111/j.1525-1497.2005.0258.x
  • Borghans, L., Golsteyn, B., & Stenberg, A. (2015). Does expert advice improve educational choice? PLoS One, 10(12), Article e0145378. https://doi.org/10.1371/journal.pone.0145378
  • Broos, T., Verbert, K., Langie, G., Van Soom, C., & De Laet, T. (2018). Multi-institutional positioning test feedback dashboard for aspiring students: Lessons learnt from a case study in Flanders. Proceedings of the 8th International Conference on Learning Analytics and Knowledge, 51-55. https://doi.org/10.1145/3170358.3170419
  • Broos, T., Pinxten, M., Delporte, M., Verbert, K., & De Laet, T. (2019). Learning dashboards at scale: early warning and overall first year experience. Assessment & Evaluation in Higher Education, 45(6), 1–20. https://doi.org/10.1080/02602938.2019.1689546
  • Charters, E. (2003). The use of think-aloud methods in qualitative research: An introduction to think-aloud methods. Brock Education: A Journal of Educational Research and Practice, 12(2), 68-82. https://doi.org/10.26522/brocked.v12I2.38
  • Cobern, W.W., & Adams, B.A. (2020). When interviewing: How many is enough? International Journal of Assessment Tools in Education, 7(1), 73-79. https://dx.doi.org/10.21449/ijate.693217
  • Cohen, A.D. (2006). The coming age of research on test-taking strategies. Language Assessment Quarterly, 3(4), 307-331. https://doi.org/10.1080/15434300701333129
  • Cook, D.A., Zendejas, B., Hamstra, S.J., Hatala, R., & Brydges, R. (2014). What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Advances in Health Sciences Education, 19(2), 233-250. https://doi.org/10.1007/s10459-013-9458-4
  • Creswell, J.W. (2014). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). Edinburgh Gate: Pearson Education Limited.
  • Delnoij, L.E.C., Dirkx, K.J.H., Janssen, J.P.W., & Martens, R.L. (2020a). Predicting and resolving non-completion in higher (online) education – A literature review. Educational Research Review, 29, 100313. https://doi.org/10.1016/j.edurev.2020.100313
  • Delnoij, L.E.C., Janssen, J.P.W., Dirkx, K.J.H., Gijselaers, H.J.M., De Groot, R.H.M., Neroni, J., De Bie, M., & Martens, R.L. (2021). Predicting completion: The road to informed study decisions in higher online education. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.668922
  • Delnoij, L.E.C., Janssen, J.P.W., Dirkx, K.J.H., & Martens, R.L. (2020b). Designing an online self-assessment for informed study decisions: The user perspective. In C. Alario-Hoyos, M.J. Rodríguez-Triana, M. Scheffel, I. Arnedillo-Sánchez, & S.M. Dennerlein (Eds.), Lecture Notes in Computer Science: Vol. 12315. Addressing Global Challenges and Quality Education. Springer.
  • Downing, S.M. (2003). Validity: on the meaningful interpretation of assessment data. Medical Education, 37(9), 830-837. https://doi.org/10.1046/j.1365-2923.2003.01594.x
  • Downing, S.M., & Haladyna, T.M. (2004). Validity threats: overcoming interference with proposed interpretations of assessment data. Medical Education, 38(3), 327-333. https://doi.org/10.1046/j.1365-2923.2004.01777.x
  • Dowker, A., Sarkar, A., & Looi, C.Y. (2016). Mathematics anxiety: What have we learned in 60 years? Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.00508
  • Dunning, D., Heath, C., & Suls, J.M. (2004). Flawed self-assessment: Implications for health, education, and the workplace. Psychological Science in the Public Interest, 5(3), 69–106. https://doi.org/10.1111/j.1529-1006.2004.00018.x
  • Erlingsson, C., & Brysiewicz, P. (2017). A hands-on guide to doing content analysis. African Journal of Emergency Medicine, 7(3), 93-99. https://doi.org/10.1016/j.afjem.2017.08.001
  • Fonteyne, L., & Duyck, W. (2015). Vraag het aan SIMON! [Ask SIMON!]. Thema Hoger Onderwijs, 2, 56-60.
  • Goodwin, L.D., & Leech, N.L. (2003). The meaning of validity in the new standards for educational and psychological testing: Implications for measurement courses. Measurement & Evaluation in Counseling & Development, 36(3), 181-191. https://doi.org/10.1080/07481756.2003.11909741
  • Guba, E.G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology, 29(2), 75-91.
  • Hartnett, M. (2016). The importance of motivation in online learning. In M. Hartnett (Ed.), Motivation in online education (pp. 5-32). Springer. https://doi.org/10.1007/978-981-10-0700-2_2
  • Jivet, I., Scheffel, M., Schmitz, M., Robbers, S., Specht, M., & Drachsler, H. (2020). From students with love: An empirical study on learner goals, self-regulated learning and sense-making of learning analytics in higher education. The Internet and Higher Education, 47. https://doi.org/10.1016/j.iheduc.2020.100758
  • Krefting, L. (1991). Rigor in qualitative research: The assessment of trustworthiness. American Journal of Occupational Therapy, 45(3), 214-222.
  • Kubinger, K.D., Frebort, M., & Müller, C. (2012). Self-assessment im Rahmen der Studienberatung: Möglichkeiten und Grenzen [Self-assessment in the context of study counselling: Possibilities and limits]. In K.D. Kubinger, M. Frebort, L. Khorramdel, & L. Weitensfelder (Eds.), Self-Assessment: Theorie und Konzepte [Self-assessment: Theory and concepts] (pp. 9-24). Lengerich: Pabst Science Publishers.
  • Kutlu, O., & Yavuz, H.C. (2019). An effective way to provide item validity: Examining student response processes. International Journal of Assessment Tools in Education, 6(1), 9-24. https://doi.org/10.21449/ijate.447780
  • Liebert, R.M., & Morris, L.W. (1967). Cognitive and emotional components of test anxiety: A distinction and some initial data. Psychological Reports, 20(3), 975-978. https://doi.org/10.2466/pr0.1967.20.3.975
  • McCambridge, J., Witton, J., & Elbourne, D.R. (2014). Systematic review of the Hawthorne effect: new concepts are needed to study research participation effects. Journal of Clinical Epidemiology, 67(3), 267-277. https://doi.org/10.1016/j.jclinepi.2013.08.015
  • Messick, S. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher, 18(2), 5-11. https://onlinelibrary.wiley.com/doi/pdf/10.1002/j.2330-8516.1988.tb00303.x
  • Muljana, P.S., & Luo, T. (2019). Factors contributing to student retention in online learning and recommended strategies for improvement: A systematic literature review. Journal of Information Technology Education: Research, 18. https://doi.org/10.28945/4182
  • Nolden, P., & Wosnitza, M. (2016). Webgestützte Selbstreflexion von Abbruchrisiken bei Studierenden [Web-based self-reflection of drop-out risk among students]. Empirische Pädagogik, 30(3/4), 576-603.
  • Nolden, P., Wosnitza, M., Karabenick, S.A., Peixoto, F., Gonida, E., Ilic, I.S., Almeida, L.S., Stamovlasis, D., Do Céu Taveira, M., Toskoviç, O., Falanga, K., Aivazidis, C., Krstic, K., Videnovic, M., Gouveia, M., Castro Silva, J., Delzepich, R., Holder, L., & Clinton, E. (2019). Enhancing student self-reflection on the situation at university. The SRT scale inventory.
  • Nicol, D. (2009). Assessment for learner self‐regulation: enhancing achievement in the first year using learning technologies. Assessment & Evaluation in Higher Education, 34(3), 335-352. https://doi.org/10.1080/02602930802255139
  • Niessen, A.S.M., & Meijer, R.R. (2017). Voorspellen in het kader van de studiekeuzecheck: Tijd voor verbetering [Predicting in the context of study decisions: time for improvement]. Onderzoek van Onderwijs, 46, 5-7.
  • Niessen, A.S.M., Meijer, R.R., & Tendeiro, J.N. (2017). Measuring non-cognitive predictors in high-stakes contexts: The effect of self-presentation on self-report instruments used in admission to higher education. Personality and Individual Differences, 106, 183-189. http://dx.doi.org/10.1016/j.paid.2016.11.014
  • Niessen, A.S.M., Meijer, R.R., & Tendeiro, J.N. (2018). Admission testing for higher education: A multi-cohort study on the validity of high-fidelity curriculum-sampling tests. PLoS One, 13(6), Article e0198746. https://doi.org/10.1371/journal.pone.0198746
  • O’Regan, L., Brown, M., Harding, N., McDermott, G., & Ryan, S. (2016). Technology-enabled feedback in the first year: A synthesis of the literature. http://y1feedback.ie/wpcontent/uploads/2016/04/SynthesisoftheLiterature2016.pdf
  • Padilla, J.L., & Benítez, I. (2014). Validity evidence based on response processes. Psicothema, 26(1), 136-144.
  • Robinson, D.A. (1996). Orientation programs: A foundation for student learning and success. New Directions for Student Services, 75, 55-68. https://doi.org/10.1002/ss.37119967507
  • Royal, K.D. (2017). Four tenets of modern validity theory for medical education assessment and evaluation. Advances in Medical Education and Practice, 8, 567-570. https://doi.org/10.2147/AMEP.S139492
  • Sage, A.J., Cervato, C., Genschel, U., & Ogilvie, C.A. (2018). Combining academics and social engagement: A major-specific early alert method to counter student attrition in science, technology, engineering, and mathematics. Journal of College Student Retention: Research, Theory & Practice, 22(4), 611-626. https://doi.org/10.1177/1521025118780502
  • Shenton, A.K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63-75.
  • Sommer, R. (1968). The Hawthorne dogma. Psychological Bulletin, 70(6, Pt.1), 592. https://doi.org/10.1037/h0026728
  • Soppe, K.F.B., Wubbels, T., Leplaa, H.J., Klugkist, I., & Wijngaards-de Meij, L.D.N.V. (2019). Do they match? Prospective students’ experiences with choosing university programmes. European Journal of Higher Education, 9(4), 359-376. https://doi.org/10.1080/21568235.2019.1650088
  • St-Onge, C., Young, M., Eva, K.W., & Hodges, B. (2017). Validity: One word with a plurality of meanings. Advances in Health Sciences Education, 22, 853-867. https://doi.org/10.1007/s10459-016-9716-3
  • Tinto, V. (2005). Taking student success seriously: Rethinking the first year of college. Paper presented at the Ninth Annual Intersession Academic Affairs Forum, California State University, Fullerton.
  • Van den Akker, J., Bannan, B., Kelly, A.E., Nieveen, N., & Plomp, T. (2013). Educational design research: An introduction. In Educational design research. Enschede.
  • Van Den Haak, M., De Jong, M., & Schellens, P.J. (2003). Retrospective vs. concurrent think-aloud protocols: Testing the usability of an online library catalogue. Behaviour & Information Technology, 22(5), 339-351. https://doi.org/10.1080/0044929031000
  • Viswesvaran, C., & Ones, D.S. (1999). Meta-analyses of fakability estimates: Implications for personality measurement. Educational and Psychological Measurement, 59, 197-210. http://dx.doi.org/10.1177/00131649921969802
  • Vossensteyn, H., Kottmann, A., Jongbloed, B.W.A., Kaiser, F., Cremonini, L., Stensaker, B., Hovdhaugen, E., & Wollscheid, S. (2015). Dropout and completion in higher education in Europe: Main report. https://research.utwente.nl/en/publications/dropout-and-completion-in-higher-education-in-europe-main-report
  • Wachen, J., Pretlow, J., & Dixon, K.G. (2016). Building college readiness: Exploring the effectiveness of the UNC Academic Summer Bridge Program. Journal of College Student Retention: Research, Theory & Practice, 20(1), 116-138. https://doi.org/10.1177/1521025116649739
  • Zhang, X., Gossett, C., Simpson, J., & Davis, R. (2019). Advising students for success in higher education: An all-out effort. Journal of College Student Retention: Research, Theory & Practice, 21(1), 53-77. https://doi.org/10.1177/1521025116689097

Details

Primary Language English
Subjects Studies on Education
Journal Section Articles
Authors

Laurie Delnoij 0000-0001-6363-5714

José Janssen 0000-0002-5104-7648

Kim Dirkx 0000-0001-8014-0916

Rob Martens 0000-0001-7193-8125

Early Pub Date April 28, 2022
Publication Date June 26, 2022
Submission Date September 9, 2021
Published in Issue Year 2022 Volume: 9 Issue: 2

Cite

APA Delnoij, L., Janssen, J., Dirkx, K., & Martens, R. (2022). Towards an Online Self-Assessment for Informed Study Decisions–A Mixed-Methods Validation Study. International Journal of Assessment Tools in Education, 9(2), 376-396.
