An Effective Way to Provide Item Validity: Examining Student Response Processes
Abstract
Studies based on individuals' response processes can provide information that supports an assessment and increases the validity of the items in a scale or test. The purpose of this study is to examine the extent to which student response processes are effective in identifying and improving the characteristics of the items in an achievement test and in collecting validity evidence. For this purpose, 28 Turkish fourth-grade students were chosen; half were high-achieving students and the remaining half were low-achieving students. The items for the study were chosen from the Trends in International Mathematics and Science Study (TIMSS) 2007 and 2011 assessments, taking several item characteristics into consideration. An interview guide was prepared before the cognitive interviews. The study found that cognitive interviews, especially those conducted with high-achieving students, can serve to improve item validity. The cognitive interviews with low-achieving students yielded information about how students who lacked the specific knowledge measured by an item were nevertheless able to respond to it.
Details
Primary Language
English
Subjects
Studies on Education
Journal Section
Research Article
Publication Date
March 21, 2019
Submission Date
July 25, 2018
Acceptance Date
November 19, 2018
Published in Issue
Year 2019 Volume: 6 Number: 1
Cited By
- Towards an Online Self-Assessment for Informed Study Decisions: A Mixed-Methods Validation Study. International Journal of Assessment Tools in Education. https://doi.org/10.21449/ijate.949160
- Adaptation of the Shared-Metacognition Questionnaire (SMQ) into Turkish for Online Collaborative Learning Environments. Journal of Educational Technology and Online Learning. https://doi.org/10.31681/jetol.1106008