Research Article

An Effective Way to Provide Item Validity: Examining Student Response Processes

Year 2019, Volume 6, Issue 1, 9 - 24, 21.03.2019
https://doi.org/10.21449/ijate.447780

Abstract

Studies of individuals' response processes can provide information that supports assessment and strengthens the validity of the items in a scale or test. The purpose of this study is to show the extent to which student response processes are effective in identifying and improving the characteristics of items in an achievement test and in collecting validity evidence. For this purpose, 28 Turkish fourth-grade students were selected: half were high-achieving students and the other half were low-achieving students. The items for the study were chosen from the Trends in International Mathematics and Science Study (TIMSS) 2007 and 2011 assessments, taking several item characteristics into consideration. An interview guide was prepared before the cognitive interviews. The study found that cognitive interviews, especially those conducted with high-achieving students, can serve to improve item validity. The cognitive interviews with low-achieving students yielded information about how students who lacked the specific knowledge an item measured were nevertheless able to respond to it.
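
The item-selection step is described only at a high level above. Purely as an illustrative sketch, not the authors' actual procedure, the Python snippet below shows one way a selection by item characteristics could be scripted from a table of released TIMSS item statistics; the file name, column names, and numeric thresholds are hypothetical assumptions introduced here.

    # Hypothetical sketch: filtering released TIMSS items by item characteristics.
    # The CSV file, its columns, and the numeric thresholds are illustrative
    # assumptions, not the selection procedure reported in the article.
    import pandas as pd

    items = pd.read_csv("timss_released_items.csv")  # hypothetical statistics table

    # Keep items that span the three TIMSS cognitive domains and fall in a
    # middle band of difficulty, so both high- and low-achieving students
    # can meaningfully engage with them.
    selected = items[
        items["cognitive_domain"].isin(["Knowing", "Applying", "Reasoning"])
        & items["percent_correct"].between(20, 80)  # avoid floor/ceiling items
    ]

    # Take the two hardest remaining items per cognitive domain as candidates
    # for the cognitive-interview protocol.
    protocol = (
        selected.groupby("cognitive_domain", group_keys=False)
                .apply(lambda g: g.nsmallest(2, "percent_correct"))
    )
    print(protocol[["item_id", "cognitive_domain", "percent_correct"]])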

References

  • AERA, APA, & NCME (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • AERA, APA, & NCME (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • Beddow, P. A., Elliott, S. N., & Kettler, R. J. (2013). Test accessibility: Item reviews and lessons learned from four state assessments. Education Research International, 2013, 1-12. doi:10.1155/2013/952704
  • Benitez, I., & Padilla, J. L. (2013). Analysis of nonequivalent assessments across different linguistic groups using a mixed methods approach: Understanding the causes of differential item functioning by cognitive interviewing. Journal of Mixed Methods Research, 8(1), 52-68. doi: 10.1177/1558689813488245
  • Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111(4), 1061-1071. doi: 10.1037/0033-295X.111.4.1061
  • Bowen, N. K., Bowen, G. L., & Woolley, M. E. (2004). Constructing and validating assessment tools for school-based practitioners: The Elementary School Success Profile. In A. R. Roberts & K. Y. Yeager (Eds.) Evidence-based practice manual: Research and outcome measures in health and human services (pp. 509-517). New York: Oxford University Press.
  • Conrad, F., & Blair, J. (2004). Aspects of data quality in cognitive interviews: The case of verbal reports. In S. Presser, J. Rothgeb, M. Couper, J. Lessler, E. Martin, J. Martin, et al. (Eds.) Questionnaire development, evaluation and testing methods (pp. 67-88). New York: Wiley.
  • Cronbach, L. J. (1984). Essentials of psychological testing. NY: Harper.
  • Desimone, L., & Le Floch, K. C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26(1), 1-22. doi:10.3102/01623737026001001
  • DeWalt, D. A., Rothrock, N., Yount, S., et al. (2007). Evaluation of item candidates: The PROMIS qualitative item review. Medical Care, 45(1), 12-21. doi: 10.1097/01.mlr.0000254567.79743.e2
  • Ding, L., Reay, N. W., Lee, A., & Bao, L. (2009). Are we asking the right questions? Validating clicker question sequences by student interviews. American Journal of Physics, 77(7), 643-650. doi:10.1119/1.3116093
  • Ercikan, K., Arim, R., & Law, D. (2010). Application of think aloud protocols for examining and confirming sources of differential item functioning identified by expert reviews. Educational Measurement: Issues and Practice, 29, 24-35. doi:10.1111/j.1745-3992.2010.00173.x
  • Goldstein, H. (2008). How may we use international comparative studies to inform education policy. Retrieved from http://www.bristol.ac.uk/media-library/sites/cmm/migrated/documents/how-useful-are-international-comparative-studies-in-education.pdf
  • Haladyna, T. M. (1996). Developing and validating multiple-choice test items. NJ: Lawrence Erlbaum Associates.
  • Haladyna, T. M. (1997). Writing test items to evaluate higher order thinking. USA: Allyn & Bacon.
  • Hopfenbeck, T. N., & Maul, A. (2011). Examining evidence for the validity of PISA Learning Strategy Scales based on student response processes. International Journal of Testing, 11(2), 95-121. doi: 10.1080/15305058.2010.529977
  • Johnstone, C., Figueroa, C., Yigal, A., Stone, E., & Laitusis, C. (2013). Results of a cognitive interview study of immediate feedback and revision opportunities for students with disabilities in large scale assessments (Synthesis Report 92). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
  • Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50(1), 1-73. doi: 10.1111/jedm.12000
  • Linn, R. L. (1989). Educational measurement. NJ: American Council on Education and Macmillan Publishing Company.
  • Linn, R. L., & Gronlund, N. E. (1995). Measurement and assessment in teaching (7th ed.). Englewood Cliffs, NJ: Prentice Hall.
  • Lissitz, R. W., & Samuelsen, K. (2007). A suggested change in terminology and emphasis regarding validity and education. Educational Researcher, 36(8), 437-448.
  • Nicolaidis, C., Chianello, T., & Gerrity, M. (2011). The development and initial psychometric assessment of the Centrality of Pain Scale. Pain Medicine, 12, 612-617.
  • Nitko, A. J., & Brookhart, S. M. (2007). Educational assessment of students (5th ed.). Upper Saddle River, NJ: Prentice-Hall.
  • Noble, T., Rosebery, A., Suarez, C., Warren, B., & O’Connor, M. C. (2014). Science assessments and English language learners: Validity evidence based on response processes. Applied Measurement in Education, 27(4), 248–260.
  • Osterlind, S. J. (2002). Constructing test items: Multiple-choice, constructed-response, performance and other formats. New York: Kluwer Academic Publishers.
  • Ouimet, J. A., Bunnage, J. C., Carini, R. M., Kuh, G. D., & Kennedy, J. (2004). Using focus groups, expert advice, and cognitive interviews to establish the validity of a college student survey. Research in Higher Education, 45(3), 233-250.
  • Padilla, J. L., & Benítez, I. (2014). Validity evidence based on response processes. Psicothema, 26, 136-144. doi: 10.7334/psicothema2013.259
  • Pehlivan Tunç, E. B., & Kutlu, Ö. (2014). Investigation of answering behaviour in Turkish test. Journal of Measurement and Evaluation in Education and Psychology, 5(1), 61-71.
  • Peterson, C. H., Peterson, N. A., & Powell, K. G. (2017). Cognitive interviewing for item development: Validity evidence based on content and response processes. Measurement and Evaluation in Counseling and Development, 50(4), 217-223. doi: 10.1080/07481756.2017.1339564
  • Ryan, K., Gannon-Slater, N., & Culbertson, M. J. (2012). Improving survey methods with cognitive interviews in small- and medium-scale evaluations. American Journal of Evaluation, 33(3), 414-430. doi:10.1177/1098214012441499
  • Sireci, S., & Faulkner-Bond, M. (2014). Validity evidence based on test content. Psicothema, 26(1), 100-107. doi: 10.7334/psicothema2013.256
  • Sireci, S. G. (2007). On validity theory and test validation. Educational Researcher, 36(8), 477-481. doi: 10.3102/0013189X07311609
  • Snow, E., & Katz, I. (2009). Using cognitive interviews to validate an interpretive argument for the ETS iSkills assessment. Communications in Information Literacy, 3(2), 99-127.
  • Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage Publications, Inc.
  • Tourangeau, R., Rips, L., & Rasinski, K. (2000). The psychology of survey response. NY: Cambridge University Press.
  • Towns, M. H. (2014). Guide to developing high-quality, reliable, and valid multiple-choice assessments. Journal of Chemical Education, 91(9), 1426-1431. doi: 10.1021/ed500076x
  • TIMSS 2007 Assessment. Copyright © 2009 International Association for the Evaluation of Educational Achievement (IEA). Publisher: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  • TIMSS 2011 Assessment. Copyright © 2013 International Association for the Evaluation of Educational Achievement (IEA). Publisher: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  • Wildy, H., & Clarke, S. (2009). Using cognitive interviews to pilot an international survey of principal preparation: A Western Australian perspective. Educational Assessment, Evaluation and Accountability, 21(2), 105-117. doi: 10.1007/s11092-009-9073-3
  • Willis, G. (2015). Analysis of the cognitive interview in questionnaire design (understanding qualitative research). NY: Oxford University Press.
  • Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.

Details

Primary Language English
Subjects Studies on Education
Journal Section Articles
Authors

Omer Kutlu

Hatice Cigdem Yavuz (ORCID: 0000-0003-2585-3686)

Publication Date March 21, 2019
Submission Date July 25, 2018
Published in Issue Year 2019, Volume 6, Issue 1

Cite

APA Kutlu, O., & Yavuz, H. C. (2019). An Effective Way to Provide Item Validity: Examining Student Response Processes. International Journal of Assessment Tools in Education, 6(1), 9-24. https://doi.org/10.21449/ijate.447780
