Research Article

An Effective Way to Provide Item Validity: Examining Student Response Processes

Volume: 6 Number: 1 March 21, 2019

Abstract

Studies of individuals' response processes can provide information that supports assessment and increases the validity of the items in a scale or test. The purpose of this study is to show the extent to which student response processes are effective in identifying and improving the characteristics of items in an achievement test and in collecting validity evidence. For this purpose, 28 Turkish fourth-grade students were chosen: half were high-achieving and the other half low-achieving. The items for the study were selected from the Trends in International Mathematics and Science Study (TIMSS) 2007 and 2011 assessments, with several item characteristics taken into consideration. An interview guide was prepared before the cognitive interviews. The study found that cognitive interviews, especially those conducted with the high-achieving students, can help improve item validity. The cognitive interviews with the low-achieving students yielded information about how students who lacked the specific knowledge an item measured were nonetheless able to respond to it.


Details

Primary Language

English

Subjects

Studies on Education

Journal Section

Research Article

Publication Date

March 21, 2019

Submission Date

July 25, 2018

Acceptance Date

November 19, 2018

Published in Issue

Year 2019 Volume: 6 Number: 1

APA
Kutlu, O., & Yavuz, H. C. (2019). An Effective Way to Provide Item Validity: Examining Student Response Processes. International Journal of Assessment Tools in Education, 6(1), 9-24. https://doi.org/10.21449/ijate.447780
